SwiftOpenAI-CLI

Version · macOS 13+ · Linux · Buy me a coffee

A command-line interface for OpenAI's API and OpenAI-compatible providers, built with Swift.

Features

  • 💬 Chat - Interactive conversations with GPT models
  • 🤖 Agent Mode - AI agent with MCP tool integration and conversation memory
  • 🔌 MCP Integration - Connect to GitHub, databases, Slack, and more via Model Context Protocol
  • 🚀 GPT-5 Support - Advanced reasoning and verbosity controls for GPT-5 models
  • 🌐 Multi-Provider Support - Works with OpenAI, xAI (Grok), Groq, DeepSeek, OpenRouter, and more
  • 🎯 Provider-Aware Models - Automatically shows appropriate models for your configured provider
  • 🐞 Debug Mode - Toggle HTTP status codes and headers visibility for troubleshooting
  • 🌡️ Temperature Control - Configure default temperature in settings, override per command
  • 🖼️ Image Generation - Generate images with AI models
  • 📊 Models - List and filter available models with custom model support
  • 🔤 Completions - Generate text completions
  • 🧮 Embeddings - Generate text embeddings
  • ⚙️ Configuration - Interactive setup wizard and fine-grained settings control

Installation

Using npm (Recommended)

The easiest way to install SwiftOpenAI-CLI is via npm:

npm install -g swiftopenai-cli

That's it! The swiftopenai command is now available globally.

Platform Support:

  • ✅ macOS (Apple Silicon M1/M2/M3)
  • ⚠️ macOS (Intel) - Requires Rosetta 2
  • ❌ Linux - Use "Build from Source" below
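Since the npm-distributed binary targets Apple Silicon, Intel Macs need Rosetta 2 before it will run. A minimal pre-install check, sketched in shell (`softwareupdate --install-rosetta --agree-to-license` is the standard macOS way to install Rosetta 2):

```shell
# Detect an Intel Mac and print the Rosetta 2 install command if needed.
needs_rosetta() {
  # True only for the x86_64 architecture reported by `uname -m`.
  [ "$1" = "x86_64" ]
}

if needs_rosetta "$(uname -m)"; then
  echo "Intel Mac detected; run: softwareupdate --install-rosetta --agree-to-license"
else
  echo "No Rosetta 2 needed on this architecture"
fi
```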

Build from Source

Ideal for developers, contributors, or anyone who needs to run on Linux.

  1. Clone the repository:
git clone https://github.com/jamesrochabrun/SwiftOpenAICLI.git
cd SwiftOpenAICLI
  2. Build the project:
swift build -c release
  3. Install the binary:
cp .build/release/swiftopenai /usr/local/bin/

Or run directly without installing:

swift run swiftopenai "Hello, world!"

Alternative Installation Methods

Using Mint

Mint is a package manager for Swift command-line tools.

  1. Install Mint:
brew install mint
  2. Install SwiftOpenAI-CLI:
mint install jamesrochabrun/SwiftOpenAICLI

Note: You'll need to add Mint's bin directory to your PATH:

export PATH="$HOME/.mint/bin:$PATH"

Debug Build Information

The CLI includes debug output when built in debug mode:

  • Full curl commands for API requests
  • HTTP response headers and status codes
  • Raw JSON responses from the API

To build with debug output:

swift build  # Debug mode
swift build -c release  # Release mode (recommended)

Updating

For npm installations

Check if an update is available:

npm outdated -g swiftopenai-cli

Update to the latest version:

npm update -g swiftopenai-cli

Or force update to the latest version:

npm install -g swiftopenai-cli@latest

For source builds

Pull the latest changes and rebuild:

cd SwiftOpenAICLI
git pull
swift build -c release
cp .build/release/swiftopenai /usr/local/bin/

Verify your version

After updating, confirm the new version:

swiftopenai --version

Note: Check the releases page for any breaking changes before updating.

Configuration

Quick Setup (Recommended)

Use the interactive configuration setup:

From Command Line

swiftopenai config setup

From Interactive Mode

# In chat or agent mode
You: /config setup

# In ISA
isa
You: /config setup

Both methods launch the same interactive wizard that guides you through:

  • Selecting your AI provider (OpenAI, xAI/Grok, Groq, DeepSeek, etc.)
  • Entering your API key
  • Choosing the default model
  • Configuring debug mode (show/hide HTTP status codes)
  • Setting up the API endpoint (if needed)

Manual Configuration

Set API Key

# For OpenAI (default)
swiftopenai config set api-key sk-...

# For other providers (set provider first)
swiftopenai config set provider xai
swiftopenai config set api-key xai-...
swiftopenai config set base-url https://api.x.ai/v1

Environment Variables

The CLI respects provider-specific environment variables:

  • OpenAI: OPENAI_API_KEY
  • xAI/Grok: XAI_API_KEY
  • Groq: GROQ_API_KEY
  • DeepSeek: DEEPSEEK_API_KEY
  • Anthropic: ANTHROPIC_API_KEY
  • OpenRouter: OPENROUTER_API_KEY

Note: Provider-specific environment variables take precedence over config file when the corresponding provider is selected.
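The precedence rule amounts to: prefer the provider-specific environment variable when it is set, otherwise fall back to the stored config value. A sketch of that logic (variable names are illustrative; `config_key` stands in for the value in the config file):

```shell
# Illustration of the precedence note above (key values are placeholders).
config_key="sk-from-config-file"      # value stored in the config file
OPENAI_API_KEY="sk-from-environment"  # provider-specific env var

effective_key="${OPENAI_API_KEY:-$config_key}"
echo "$effective_key"   # env var wins while it is set

unset OPENAI_API_KEY
effective_key="${OPENAI_API_KEY:-$config_key}"
echo "$effective_key"   # falls back to the config value
```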

Supported Providers

| Provider          | Base URL                       | Example Models                          |
|-------------------|--------------------------------|-----------------------------------------|
| OpenAI (default)  | —                              | gpt-4o, gpt-5, gpt-5-mini, o1-preview   |
| xAI/Grok          | https://api.x.ai/v1            | grok-4-0709, grok-3, grok-code-fast-1   |
| Groq              | https://api.groq.com/openai/v1 | llama-3.3-70b-versatile, mixtral-8x7b   |
| DeepSeek          | https://api.deepseek.com       | deepseek-chat, deepseek-coder           |
| OpenRouter        | https://openrouter.ai/api/v1   | anthropic/claude-3.5-sonnet, gemini-pro |
| Anthropic*        | https://openrouter.ai/api/v1   | claude-3.5-sonnet, claude-3.5-haiku     |
| Custom            | (your URL)                     | Any OpenAI-compatible model             |

*Via OpenRouter for OpenAI-compatible API

Configuration Options

Temperature

Controls the randomness of AI responses (0.0 = deterministic, 2.0 = very creative)

# Set default temperature
swiftopenai config set temperature 0.8

# Override per command
swiftopenai chat "Write a poem" --temperature 1.5

Debug Mode

Shows HTTP status codes and headers for API calls

# Enable debug mode
swiftopenai config set debug true

# Disable debug mode
swiftopenai config set debug false

View Configuration

# List all settings
swiftopenai config list

# Get specific setting
swiftopenai config get default-model

Usage

Chat

# Simple chat
swiftopenai "What is the capital of France?"

# Plain output without formatting (useful for scripts)
swiftopenai -p "What is the capital of France?"

# Interactive mode
swiftopenai chat --interactive

# Interactive mode with plain output
swiftopenai chat \
  --interactive \
  --plain

# With specific model
swiftopenai chat "Explain quantum computing" \
  --model gpt-4o

# With GPT-5 models (supports both formats)
swiftopenai chat "Write a function" \
  --model gpt5 \
  --reasoning minimal

swiftopenai chat "Explain this concept" \
  --model gpt-5-mini \
  --verbose high

# With system prompt
swiftopenai chat "How do I sort an array?" \
  --system "You are a helpful assistant"

Provider Examples

# Using xAI (Grok)
swiftopenai config set provider xai
swiftopenai chat "Explain relativity" --model grok-4-0709

# Using Groq with Llama
swiftopenai config set provider groq
swiftopenai chat "Write Python code" --model llama-3.3-70b-versatile

# Using DeepSeek for coding
swiftopenai config set provider deepseek
swiftopenai chat "Debug this function" --model deepseek-coder

# Custom model with any provider
swiftopenai chat "Hello" --model my-custom-model-v2

Agent Mode 🤖

AI agent with MCP (Model Context Protocol) integration for external tools, conversation memory, and auto-compaction for infinite conversations.

Basic Usage

# Simple agent command (uses GPT-5 by default)
swiftopenai agent "Calculate 25 * 37 and tell me what day it is today"

# With specific model
swiftopenai agent "What's the weather like?" --model gpt-4o-mini

# Interactive agent mode
swiftopenai agent --interactive

# Interactive with tool events visible
swiftopenai agent --interactive --show-tool-events

# Use with MCP servers for tools (--allowed-tools required)
swiftopenai agent "Read the config.json file" \
  --mcp-servers filesystem \
  --allowed-tools "mcp__filesystem__*"

Advanced Usage

# Stream JSON events (like Claude SDK)
swiftopenai agent "Calculate the square root of 144" \
  --output-format stream-json

# With session ID for conversation continuity
swiftopenai agent "My name is Alice" \
  --session-id abc123

swiftopenai agent "What's my name?" \
  --session-id abc123  # Remembers Alice

# GPT-5 with verbosity and reasoning controls
swiftopenai agent "Explain quantum computing" \
  --model gpt-5 \
  --model-verbosity high \
  --reasoning high

# MCP tools with JSON output
swiftopenai agent "Query the database for recent orders" \
  --mcp-servers postgres \
  --output-format json

# With custom system prompt
swiftopenai agent "Help me debug this" \
  --system "You are an expert programmer" \
  --model gpt-5-mini

Output Formats

  • plain (default) - Human-readable output with colored formatting
  • json - Structured JSON response with metadata
  • stream-json - Event stream for each tool call and response (Claude SDK style)

Interactive Mode Features

swiftopenai agent --interactive --model gpt-5 --show-tool-events

  • Conversation Memory - Maintains context within session
  • Auto-Compaction - Automatically summarizes long conversations at 92% capacity
  • Context Warnings - Shows capacity usage (e.g., "💭 85% capacity (7% until auto-compacting)")
  • Slash Commands:
    • /config - View and manage configuration
    • /config setup - Interactive provider setup wizard
    • /models - List provider-specific models with custom option
    • /models <name> - Set any model directly (e.g., /models grok-4-0709)
    • /help - Show available commands
  • Control Commands:
    • clear - Reset conversation history
    • exit or quit - Exit interactive mode
    • Ctrl+C - Interrupt and exit
    • Ctrl+D - EOF exit

Interactive Configuration Example

$ swiftopenai chat --interactive
🤖 OpenAI Chat (gpt-4o)
Type 'exit' to quit, 'clear' to clear history

You: /config
⚙️  Current Configuration
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
• api-key:        sk-...xxxx
• default-model:  gpt-4o
• temperature:    1.0
• max-tokens:     (default)
• animated-loading: true
• provider:       openai
• debug:          disabled

💡 Usage: /config <key> <value>
   Example: /config temperature 0.7

You: /config temperature 0.5
✅ Set temperature = 0.5

You: /config setup
[Interactive provider setup wizard launches...]

You: /models
Select Model
━━━━━━━━━━━━━━━━━━━━━━━━━━━━
 > grok-4-0709 - Grok-4 • Language model
   grok-3 - Grok-3 • General purpose
   grok-code-fast-1 - Fast code generation • Optimized
   Custom model... - Enter a custom model name

[↑/↓ to navigate, Enter to select, ESC to cancel]

You: /models grok-2-vision-1212
✅ Model changed to: grok-2-vision-1212
   Using custom model

Session Management

# Start new conversation
swiftopenai agent "Hello, I'm working on a Swift project" \
  --session-id work-session

# Continue conversation (remembers context)
swiftopenai agent "What language did I mention?" \
  --session-id work-session

# Auto-compaction keeps conversations infinite
# When reaching context limit, automatically summarizes with GPT-5-mini/gpt-4o-mini

Context Windows

  • GPT-5 models: 400K tokens
  • GPT-4o models: 128K tokens
  • Auto-compaction triggers at 92% capacity
  • Fallback chain: GPT-5-mini → gpt-4o-mini → gpt-3.5-turbo
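The 92% trigger can be made concrete with a quick calculation (token counts from the list above, treating "K" as 1,000):

```shell
# Auto-compaction thresholds implied by the context sizes above.
gpt5_limit=400000
gpt4o_limit=128000

gpt5_trigger=$((gpt5_limit * 92 / 100))
gpt4o_trigger=$((gpt4o_limit * 92 / 100))

echo "GPT-5 compacts near $gpt5_trigger tokens"    # 368000
echo "GPT-4o compacts near $gpt4o_trigger tokens"  # 117760
```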

Real-World Examples

# Complex task with MCP tools (--allowed-tools required)
swiftopenai agent "List all my GitHub repositories and create an issue about updating documentation" \
  --mcp-servers github \
  --allowed-tools "mcp__github__*"

# Code analysis with filesystem MCP
swiftopenai agent "Read the Package.swift file and explain what dependencies this project uses" \
  --mcp-servers filesystem \
  --allowed-tools "mcp__filesystem__*"

# Interactive problem-solving session
swiftopenai agent \
  --interactive \
  --model gpt-5 \
  --show-tool-events
# You: I need to plan a project that starts today
# You: Calculate how many work days are in the next 30 days
# You: What's the date 30 days from now?

# Production monitoring with MCP and JSON output
swiftopenai agent "Query the database for system metrics and calculate uptime percentage" \
  --output-format json \
  --mcp-servers postgres

ISA - Intelligent Software Assistant 🤖

ISA is a Claude Code-inspired AI agent built on top of SwiftOpenAICLI, designed specifically for developers. It combines the power of the SwiftOpenAICLI agent with enhanced prompting, task management, and a developer-focused interface.

What is ISA?

ISA (Intelligent Software Assistant) is a specialized command-line AI agent that:

  • Defaults to interactive mode - Just type isa to start chatting
  • Uses GPT-5 by default - Optimized for complex reasoning and coding tasks
  • Includes Claude Code-inspired enhancements - Better task planning and execution
  • Supports context files - Use isa.md files for project-specific instructions
  • Features visual task management - Built-in todo lists with progress tracking
  • Provides a beautiful terminal UI - ASCII art branding and colored output

Installation

Using npm (Recommended)

The easiest way to install ISA is via npm:

npm install -g isa-swift

That's it! The isa command is now available globally.

Build from Source
  1. Clone the SwiftOpenAICLI repository (if not already done):
git clone https://github.com/jamesrochabrun/SwiftOpenAICLI.git
cd SwiftOpenAICLI
  2. Build ISA:
cd isa-cli
swift build -c release
  3. Install ISA globally:
# Copy the binary to /usr/local/bin
sudo cp .build/release/ISA /usr/local/bin/isa

# Or create a symlink (recommended for development)
sudo ln -sf $(pwd)/.build/release/ISA /usr/local/bin/isa
  4. Verify installation:
isa --version
# ISA version: 1.0.0

Alternative: Run without Installing

You can run ISA directly from the build directory:

cd isa-cli
swift run ISA
# Or for release performance:
swift build -c release && ./.build/release/ISA

Update ISA

To update ISA to the latest version:

cd SwiftOpenAICLI/isa-cli
git pull
swift build -c release
# If installed globally, copy the new binary
sudo cp .build/release/ISA /usr/local/bin/isa

Usage

Quick Start

# Start interactive mode (default behavior)
isa

# Send a single message
isa "Help me understand this codebase"

# Use with a specific model
isa "Refactor this function" --model gpt-4o

# Enable plan mode for complex tasks
isa "Build a REST API with authentication" --plan-mode

# Show todo list in real-time
isa "Implement user authentication" --show-todos

# With MCP servers
isa "Analyze my GitHub repos" --mcp-servers github --allowed-tools "mcp__github__*"

Interactive Mode

ISA defaults to interactive mode when run without arguments:

$ isa

╦╔═╗╔═╗
║╚═╗╠═╣
╩╚═╝╩ ╩
Intelligent Software Assistant

🚀 ISA Interactive Mode (gpt-5)
Type 'exit' to quit, 'clear' to reset, or use slash commands (/)

You: Hello! Can you help me understand this Python project?
Assistant: I'll help you understand your Python project! Let me start by exploring the structure...

You: /help
📚 Available Slash Commands
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
/help                Show available commands
/clear               Clear conversation history  
/models              List available models
/config              Show current configuration

You: exit
Goodbye!

Command-Line Options

ISA supports all SwiftOpenAICLI agent options plus additional features:

Basic Options

  • -m, --model <model> - AI model to use (default: gpt-5)
  • -i, --interactive - Force interactive mode (default when no message provided)
  • -v, --verbose - Enable verbose output
  • --help - Show help information
  • --version - Show ISA version

Advanced Options

  • --system <prompt> - Custom system prompt (enhanced with ISA context)
  • --temperature <0.0-2.0> - Control randomness (default: 1.0)
  • --max-tokens <number> - Maximum tokens to generate
  • --output-format <format> - Output format: plain, json, stream-json (default: plain)
  • --session-id <id> - Resume a previous conversation
  • --timeout <seconds> - Request timeout
  • --max-tool-calls <number> - Maximum tool calls allowed (default: 20)

ISA-Specific Options

  • --plan-mode - Show execution plan before running (great for complex tasks)
  • --show-todos - Display real-time todo list updates
  • --show-tool-events - Show detailed tool execution events

MCP and Tools Options

  • --mcp-servers <servers> - Enable MCP servers (comma-separated)
  • --allowed-tools <patterns> - Tool access patterns (e.g., "mcp__*", "local__*")
  • --local-tools-config <path> - Path to custom tools configuration

Slash Commands

ISA supports interactive slash commands for quick actions:

Available Commands

  • /help - Show all available commands
  • /help <command> - Get detailed help for a specific command
  • /clear - Clear conversation history and reset context
  • /models - List available AI models
  • /config - Display current configuration
  • /config <key> <value> - Set a configuration value
  • /config setup - Interactive provider configuration
  • /exit or /quit - Exit interactive mode

Using Slash Commands

You: /models
📊 Available Models
━━━━━━━━━━━━━━━━━━━━━━
• gpt-5 (default)
• gpt-5-mini
• gpt-5-nano
• gpt-4o
• gpt-4o-mini
• gpt-3.5-turbo

You: /config
⚙️  Current Configuration
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
• api-key:        sk-...configured
• default-model:  gpt-5
• temperature:    1.0
• max-tokens:     (default)
• max-tool-calls: 20
• animated-loading: true
• provider:       openai
• debug:          disabled

💡 Usage: /config <key> <value>
   Example: /config temperature 0.7

You: /config setup
🚀 Configuration Setup
━━━━━━━━━━━━━━━━━━━━━

Available providers:
1. OpenAI (GPT-4, GPT-3.5)
2. xAI (Grok)
3. Groq (Fast inference)
4. Anthropic (Claude)
5. DeepSeek
6. OpenRouter
7. Custom Provider

Select a provider (1-7): 2
✓ Selected: xAI (Grok)

Enter your API key for xAI: xai-...
Use default model 'grok-code-fast-1'? (Y/n): y
Enable debug mode? (y/N): n

✅ Configuration complete!
Your settings have been updated for this ISA session.

Context Files (isa.md)

ISA can read project-specific context from isa.md files in your project root:

# Project: My React App

## Conventions
- Use TypeScript with strict mode
- Follow ESLint and Prettier rules
- Components go in src/components/
- Use React hooks, no class components

## Architecture
- State management: Redux Toolkit
- Styling: Tailwind CSS
- Testing: Jest and React Testing Library

## Important Files
- src/App.tsx - Main application
- src/store/ - Redux store configuration
- src/api/ - API client code

When ISA finds an isa.md file, it automatically incorporates this context into its understanding of your project.

Real-World Examples

Code Refactoring Session

$ isa
You: I need to refactor the user authentication module to use JWT tokens
Assistant: I'll help you refactor the authentication module to use JWT tokens. Let me start by examining your current authentication setup.

[Analyzing project structure...]
✓ Found authentication module in src/auth/
✓ Current implementation uses session-based auth
✓ Identified 5 files that need updates

📋 Todo List:
1. ✅ Analyze current authentication implementation
2. 🔄 Install JWT dependencies
3. ⏳ Create JWT service module
4. ⏳ Update login endpoint
5. ⏳ Update middleware for token validation

Let me start implementing these changes...

Project Analysis with Plan Mode

$ isa "Analyze this React project and suggest improvements" --plan-mode

📝 Execution Plan:
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
1. Scan project structure and dependencies
2. Analyze component architecture
3. Check for performance issues
4. Review code patterns and conventions
5. Generate improvement recommendations

Proceed with this plan? (y/n): y

[Executing plan...]

Using with MCP Servers

# GitHub integration
$ isa "Review my recent PRs and suggest which ones need attention" \
  --mcp-servers github \
  --allowed-tools "mcp__github__*"

# File system operations
$ isa "Help me organize this project's file structure" \
  --mcp-servers filesystem \
  --allowed-tools "mcp__filesystem__*"

# Multiple servers
$ isa --interactive \
  --mcp-servers github,filesystem,postgres \
  --allowed-tools "mcp__*"
You: Check if the database schema matches the TypeScript types in src/types/

Complex Task with Todo Tracking

$ isa "Create a new REST API endpoint for user profiles" --show-todos

📋 Task Management:
━━━━━━━━━━━━━━━━━━━━━
[ ] Create route handler
[ ] Add validation schema  
[ ] Write database queries
[ ] Add tests
[ ] Update API documentation

[Working on: Create route handler]
✓ Created src/routes/profiles.ts
[Working on: Add validation schema]
✓ Added Zod schema for profile validation
...

Advanced Features

Session Persistence

ISA can maintain conversation context across sessions:

# Start a new session with a specific ID
$ isa --session-id project-refactor
You: Let's work on refactoring the database layer
Assistant: I'll help you refactor the database layer. Let me examine your current setup...

# Later, resume the same session
$ isa --session-id project-refactor  
You: What were we working on?
Assistant: We were refactoring the database layer. Last time, we identified...

Custom Tools Configuration

Create custom tools for ISA using JSON configuration:

// isa-tools.json
{
  "tools": [
    {
      "name": "run_tests",
      "description": "Run project tests",
      "command": "npm test -- {{pattern}}",
      "parameters": {
        "type": "object",
        "properties": {
          "pattern": {
            "type": "string",
            "description": "Test file pattern"
          }
        }
      }
    }
  ]
}

Use custom tools:

$ isa "Run tests for the auth module" --local-tools-config ./isa-tools.json

Integration with SwiftOpenAICLI

ISA is fully integrated with SwiftOpenAICLI and shares:

  • Configuration files (~/.swiftopenai/config.json)
  • MCP server configurations
  • API keys and settings
  • Session management

You can seamlessly switch between swiftopenai agent and isa based on your needs:

  • Use isa for development tasks with enhanced UI and defaults
  • Use swiftopenai agent for general-purpose AI agent tasks

Tips and Best Practices

  1. Use isa.md files - Document your project conventions for better assistance
  2. Enable plan mode - For complex tasks, use --plan-mode to review before execution
  3. Track progress - Use --show-todos to visualize task completion
  4. Leverage sessions - Use --session-id for long-running projects
  5. Combine with MCP - Connect to GitHub, databases, and more for powerful workflows
  6. Custom shortcuts - Create shell aliases for common ISA commands:
    alias isa-plan='isa --plan-mode --show-todos'
    alias isa-debug='isa --verbose --show-tool-events'

MCP (Model Context Protocol) Integration 🔌

SwiftOpenAI-CLI supports the Model Context Protocol (MCP), allowing you to connect to external services and tools through MCP servers. This feature is fully compatible with the Claude Code SDK specification, enabling seamless integration with a growing ecosystem of MCP-compatible tools.

What is MCP?

MCP is an open protocol developed by Anthropic that enables AI assistants to securely connect to external data sources and tools. Instead of building custom integrations for each service, MCP provides a standardized way to:

  • Access external APIs - GitHub, Slack, Google Drive, databases, and more
  • Perform system operations - File system access, shell commands, git operations
  • Query data sources - PostgreSQL, SQLite, REST APIs, GraphQL endpoints
  • Extend capabilities - Add any custom tool without modifying the CLI

Quick Start

# 1. Add the GitHub MCP server to your configuration
swiftopenai config mcp add github npx \
  --args "@modelcontextprotocol/server-github" \
  --env "GITHUB_PERSONAL_ACCESS_TOKEN=your-token" \
  --enable

# 2. Use it with the agent command (--allowed-tools is REQUIRED for MCP tools)
swiftopenai agent "List my recent pull requests" \
  --mcp-servers github \
  --allowed-tools "mcp__github__*"

# 3. Interactive mode for continuous conversations
swiftopenai agent \
  --interactive \
  --mcp-servers github \
  --allowed-tools "mcp__*"
# 🚀 MCP servers initialized once for this session (optimized!)
# You: Create an issue about the bug we discussed
# You: Show me all open issues
# You: Close issue #123

⚠️ Important: Following the Claude Code SDK security model, MCP tools must be explicitly allowed using the --allowed-tools flag. Without this flag, MCP servers will connect but their tools won't be available to the agent.

Available MCP Servers

Popular MCP servers you can use immediately:

  • GitHub (@modelcontextprotocol/server-github) - Manage repos, issues, PRs, releases
  • Filesystem (@modelcontextprotocol/server-filesystem) - Read/write local files with permissions
  • PostgreSQL (@modelcontextprotocol/server-postgres) - Query and manage PostgreSQL databases
  • Slack (@modelcontextprotocol/server-slack) - Send messages, read channels, manage workspace
  • Google Drive (@modelcontextprotocol/server-gdrive) - Access and manage Drive files
  • Git (@modelcontextprotocol/server-git) - Version control operations
  • Puppeteer (@modelcontextprotocol/server-puppeteer) - Web scraping and browser automation
  • Airbnb (@openbnb/mcp-server-airbnb) - Search listings, get details, read reviews
  • Zapier - Connect to 7,000+ apps including Gmail, Slack, Notion, and more

Find more at MCP Servers Repository

Local Tools Registry 🛠️

In addition to MCP servers, SwiftOpenAI-CLI supports custom local tools defined via JSON configuration files. This allows you to extend the agent's capabilities with your own shell commands and scripts without modifying the CLI.

What are Local Tools?

Local tools are custom commands that the AI agent can execute on your behalf:

  • Shell commands - Run any command-line tool with parameter interpolation
  • Custom scripts - Execute Python, Bash, or any other scripts
  • System utilities - Integrate with local development tools
  • Workflow automation - Chain commands for complex operations

Quick Start

  1. Create a tools configuration file (tools.json):
{
  "tools": [
    {
      "name": "search_code",
      "description": "Search for code patterns in the project",
      "command": "grep -r '{{pattern}}' {{directory}}",
      "parameters": {
        "type": "object",
        "properties": {
          "pattern": {
            "type": "string",
            "description": "The pattern to search for"
          },
          "directory": {
            "type": "string",
            "description": "Directory to search in"
          }
        }
      }
    }
  ]
}
  2. Use with the agent command:
# Single command
swiftopenai agent "Search for TODO comments in src folder" \
  --local-tools-config ./tools.json

# Interactive mode with local tools
swiftopenai agent --interactive \
  --local-tools-config ~/.my-tools.json

# Combine with MCP servers
swiftopenai agent "Search code and create GitHub issue" \
  --local-tools-config ./tools.json \
  --mcp-servers github \
  --allowed-tools "local__*,mcp__github__*"

Configuration Format

Each tool in the configuration requires:

  • name - Tool identifier (automatically prefixed with local__ internally)
  • description - What the tool does (helps AI decide when to use it)
  • command or script - The command to execute or script path
  • parameters - JSON Schema defining the tool's parameters
  • working_directory (optional) - Directory to execute the command in

Command Interpolation

Use {{parameter}} syntax for parameter substitution:

{
  "command": "curl -X {{method}} '{{url}}' -H 'Content-Type: {{content_type}}'"
}

Parameters are automatically escaped for shell safety.
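The escaping matters because parameter values may contain shell metacharacters. The CLI's exact mechanism isn't documented here; as a sketch of the kind of quoting involved, bash's `printf %q` produces a shell-safe rendering of an arbitrary value:

```shell
# Hypothetical parameter value containing shell metacharacters.
pattern='foo; rm -rf ~'

# printf %q (bash builtin / GNU printf) emits a shell-safe quoting,
# illustrating the escaping applied before command substitution.
escaped=$(printf '%q' "$pattern")
echo "grep -r $escaped src/"
```

Evaluating the escaped form round-trips to the original value, which is exactly the property an interpolation layer needs.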

Script Execution

For complex logic, use scripts instead of commands:

{
  "name": "analyze_logs",
  "description": "Analyze application logs",
  "script": "/usr/local/bin/analyze-logs.py",
  "parameters": {
    "type": "object",
    "properties": {
      "log_file": {"type": "string"},
      "date_range": {"type": "string"}
    }
  }
}

Scripts receive parameters as JSON in the first argument.
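A minimal script skeleton under that assumption (field names mirror the analyze_logs example above; `python3` does the JSON parsing since POSIX shell has none built in):

```shell
# Sketch of a local-tool script: parameters arrive as a JSON object in
# the first positional argument.
extract_param() {
  # $1 = JSON string, $2 = field name (assumed to be a plain identifier)
  printf '%s' "$1" | python3 -c \
    "import json,sys; print(json.load(sys.stdin).get('$2', ''))"
}

params='{"log_file":"app.log","date_range":"last-7-days"}'
echo "would analyze $(extract_param "$params" log_file) over $(extract_param "$params" date_range)"
```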

Complete Example

{
  "tools": [
    {
      "name": "git_status",
      "description": "Get git repository status",
      "command": "git status --short",
      "parameters": {
        "type": "object",
        "properties": {}
      }
    },
    {
      "name": "run_tests",
      "description": "Run tests with coverage",
      "command": "npm test -- {{test_pattern}} --coverage",
      "working_directory": "./project",
      "parameters": {
        "type": "object",
        "properties": {
          "test_pattern": {
            "type": "string",
            "description": "Test file pattern"
          }
        }
      }
    },
    {
      "name": "docker_ps",
      "description": "List running Docker containers",
      "command": "docker ps --format 'table {{.Names}}\t{{.Status}}\t{{.Ports}}'",
      "parameters": {
        "type": "object",
        "properties": {}
      }
    },
    {
      "name": "find_large_files",
      "description": "Find files larger than specified size",
      "command": "find {{path}} -type f -size +{{size}} -exec ls -lh {} \\;",
      "parameters": {
        "type": "object",
        "properties": {
          "path": {
            "type": "string",
            "description": "Directory to search"
          },
          "size": {
            "type": "string",
            "description": "Minimum file size (e.g., '10M', '1G')"
          }
        }
      }
    }
  ]
}

Usage Patterns

Development Workflow

# Set up your development tools
swiftopenai agent --interactive \
  --local-tools-config ~/.dev-tools.json \
  --model gpt-4o

# Available tools: git_status, run_tests, lint_code, build_project
You: Check git status and run tests for the auth module

System Administration

# System monitoring and maintenance
swiftopenai agent "Check system resources and Docker containers" \
  --local-tools-config ~/.sys-tools.json

Mixed Tool Usage

# Combine local tools with MCP servers
swiftopenai agent --interactive \
  --local-tools-config ./project-tools.json \
  --mcp-servers github,slack \
  --allowed-tools "*"  # Allow all tools

You: Run tests, and if they pass, create a GitHub PR

Security Considerations

  • Parameter Escaping: All parameters are automatically escaped to prevent shell injection
  • Working Directory: Tools can be restricted to specific directories
  • No Prefix Required: Users never need to type local__ prefix - it's added automatically
  • Explicit Permissions: Like MCP tools, use --allowed-tools to control which tools are available

Tool Selection

Tools can be selected using patterns:

  • search_code - Use tool by name (prefix added automatically)
  • local__* - All local tools
  • local__git_* - All local tools starting with "git_"
  • * - All tools (both local and MCP)
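Assuming the patterns follow ordinary glob semantics (which the examples above suggest), the matching can be pictured with a shell case statement:

```shell
# Glob-style matching, as the --allowed-tools examples above suggest.
tool_allowed() {
  # $1 = allow pattern, $2 = fully prefixed tool name
  case "$2" in
    $1) return 0 ;;
    *)  return 1 ;;
  esac
}

tool_allowed 'local__*'       'local__search_code' && echo "local tool allowed"
tool_allowed 'local__git_*'   'local__git_status'  && echo "git tool allowed"
tool_allowed 'mcp__github__*' 'mcp__slack__post'   || echo "slack tool blocked"
```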

Best Practices

  1. Descriptive Names: Use clear, action-oriented names
  2. Detailed Descriptions: Help the AI understand when to use each tool
  3. Parameter Validation: Define required parameters in the schema
  4. Error Handling: Tools should return meaningful error messages
  5. Composability: Design tools that work well together

MCP Configuration

Method 1: CLI Commands (Recommended)

# Add a new MCP server
swiftopenai config mcp add <name> <command> --args <arguments> --env <KEY=VALUE> --enable

# Examples
swiftopenai config mcp add github npx \
  --args "@modelcontextprotocol/server-github" \
  --env "GITHUB_PERSONAL_ACCESS_TOKEN=ghp_..." \
  --enable

swiftopenai config mcp add postgres npx \
  --args "@modelcontextprotocol/server-postgres" \
  --env "DATABASE_URL=postgresql://user:pass@localhost/db" \
  --enable

# Manage servers
swiftopenai config mcp list                    # List all configured servers
swiftopenai config mcp enable github           # Enable a server
swiftopenai config mcp disable github          # Disable a server  
swiftopenai config mcp remove github           # Remove a server

Method 2: Direct Configuration File

Edit ~/.swiftopenai/config.json:

Object Format (Claude Code SDK compatible):

{
  "apiKey": "sk-...",
  "defaultModel": "gpt-4o",
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_..."
      },
      "enabled": true
    },
    "postgres": {
      "command": "npx",
      "args": ["@modelcontextprotocol/server-postgres"],
      "env": {
        "DATABASE_URL": "postgresql://localhost/mydb"
      },
      "enabled": true
    }
  }
}

Array Format (Legacy, still supported):

{
  "mcpServers": [
    {
      "name": "github",
      "command": "npx",
      "args": ["@modelcontextprotocol/server-github"],
      "environment": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_..."
      },
      "enabled": true
    }
  ]
}

Usage Examples

GitHub Workflow
# Repository management (--allowed-tools is required for MCP tools)
swiftopenai agent "Create a new repo called my-project with a README" \
  --mcp-servers github \
  --allowed-tools "mcp__github__*"

swiftopenai agent "List all issues in repo jamesrochabrun/SwiftOpenAI" \
  --mcp-servers github \
  --allowed-tools "mcp__github__*"

swiftopenai agent "Create a pull request from feature branch to main" \
  --mcp-servers github \
  --allowed-tools "mcp__github__*"

# Interactive development session
swiftopenai agent \
  --interactive \
  --mcp-servers github \
  --allowed-tools "mcp__github__*"
# You: Show me all my starred repositories
# You: Create an issue in the first one about updating dependencies
# You: Add a comment to issue #42 with the solution we discussed
Database Operations
# Add PostgreSQL server
swiftopenai config mcp add postgres npx \
  --args "@modelcontextprotocol/server-postgres" \
  --env "DATABASE_URL=postgresql://user:pass@localhost/myapp" \
  --enable

# Query database (--allowed-tools required)
swiftopenai agent "Show me all users created in the last week" \
  --mcp-servers postgres \
  --allowed-tools "mcp__postgres__*"

swiftopenai agent "What's the total revenue this month?" \
  --mcp-servers postgres \
  --allowed-tools "mcp__postgres__*"

# Interactive data analysis
swiftopenai agent \
  --interactive \
  --mcp-servers postgres \
  --allowed-tools "mcp__postgres__*"
# You: List all tables in the database
# You: Show me the schema for the orders table
# You: Calculate the average order value for each month
File System Operations
# Add filesystem server with specific permissions
swiftopenai config mcp add fs npx \
  --args "@modelcontextprotocol/server-filesystem,/Users/me/projects" \
  --enable

# File operations (--allowed-tools required)
swiftopenai agent "List all Python files in the current directory" \
  --mcp-servers fs \
  --allowed-tools "mcp__fs__*"

swiftopenai agent "Read the package.json and summarize dependencies" \
  --mcp-servers fs \
  --allowed-tools "mcp__fs__*"

swiftopenai agent "Create a new file called notes.md with our discussion" \
  --mcp-servers fs \
  --allowed-tools "mcp__fs__*"
Zapier Integration (Gmail, Slack, and 7,000+ Apps)

Zapier MCP enables AI integration with thousands of apps through Zapier's platform. This is particularly powerful for Gmail automation, Slack messaging, and connecting to services without direct MCP support.

Quick Start - Try it in 2 minutes:

  1. Get your Zapier MCP URL from zapier.com/app/ai-actions
  2. Add the Zapier server:
swiftopenai config mcp add-http zapier \
  "https://mcp.zapier.com/api/mcp/s/YOUR_ZAPIER_TOKEN/mcp" \
  --enable
  3. Start using Gmail immediately:
# Find emails
swiftopenai agent "Find my latest unread email" \
  --mcp-servers zapier \
  --allowed-tools "mcp__zapier__gmail_find_email"

# Interactive mode for email management
swiftopenai agent \
  --interactive \
  --mcp-servers zapier \
  --allowed-tools "mcp__zapier__gmail_*"

Gmail Examples:

# Find emails
swiftopenai agent "Find my latest email" \
  --mcp-servers zapier \
  --allowed-tools "mcp__zapier__gmail_find_email" \
  --model gpt-4o-mini

# Send an email
swiftopenai agent "Send an email to [email protected] saying 'Meeting confirmed for 3pm tomorrow'" \
  --mcp-servers zapier \
  --allowed-tools "mcp__zapier__gmail_send_email"

# Create a draft
swiftopenai agent "Create a draft email to [email protected] with subject 'Project Update' and body 'The project is on track for Q1 delivery'" \
  --mcp-servers zapier \
  --allowed-tools "mcp__zapier__gmail_create_draft"

# Reply to emails
swiftopenai agent "Find the email from John about the budget and reply saying 'Approved, please proceed'" \
  --mcp-servers zapier \
  --allowed-tools "mcp__zapier__gmail_find_email,mcp__zapier__gmail_reply_to_email"

# Manage labels
swiftopenai agent "Create a new Gmail label called 'Important-2025'" \
  --mcp-servers zapier \
  --allowed-tools "mcp__zapier__gmail_create_label"

swiftopenai agent "Find all emails from Amazon and add the 'Receipts' label" \
  --mcp-servers zapier \
  --allowed-tools "mcp__zapier__gmail_find_email,mcp__zapier__gmail_add_label_to_email"

# Archive and delete
swiftopenai agent "Archive all emails older than 30 days with the label 'Processed'" \
  --mcp-servers zapier \
  --allowed-tools "mcp__zapier__gmail_find_email,mcp__zapier__gmail_archive_email"

Interactive Gmail Session:

swiftopenai agent \
  --interactive \
  --mcp-servers zapier \
  --allowed-tools "mcp__zapier__gmail_*" \
  --model gpt-4o-mini

# You: Find my unread emails from today
# You: Summarize the important ones
# You: Create a draft reply to the most urgent one
# You: Add the "ToDo" label to emails that need follow-up

Real-World Interactive Examples:

# Email Triage Session
swiftopenai agent \
  --interactive \
  --mcp-servers zapier \
  --allowed-tools "mcp__zapier__gmail_*"

# You: Find all unread emails from the last 3 days
# Assistant: I found 24 unread emails from the last 3 days...
# You: Which ones are from my manager or marked as high priority?
# Assistant: I found 3 emails from your manager...
# You: Create draft replies for each of them acknowledging receipt
# Assistant: I've created 3 draft replies...
# You: Archive all newsletter emails
# Assistant: I've archived 8 newsletter emails...

# Email Cleanup Session
swiftopenai agent \
  --interactive \
  --mcp-servers zapier \
  --allowed-tools "mcp__zapier__*" \
  --model gpt-4o-mini

# You: How many emails do I have with the label "To Review"?
# Assistant: You have 47 emails labeled "To Review"...
# You: Show me the oldest 5
# Assistant: Here are the 5 oldest emails...
# You: Delete anything older than 6 months
# Assistant: I've moved 23 emails older than 6 months to trash...
# You: Create a new label called "Archive-2024" and apply it to all reviewed emails from 2024
# Assistant: Created label "Archive-2024" and applied it to 15 emails...

# Daily Email Workflow
swiftopenai agent \
  --interactive \
  --mcp-servers zapier \
  --allowed-tools "mcp__zapier__gmail_*"

# You: Good morning! What important emails came in overnight?
# Assistant: Good morning! You received 3 important emails overnight...
# You: Send a quick reply to Sarah saying I'll review the document by noon
# Assistant: I've sent the reply to Sarah confirming you'll review by noon...
# You: Create a calendar reminder for that review at 11:30am
# You: Find all emails about the Q1 planning meeting
# Assistant: I found 7 emails about Q1 planning...
# You: Create a summary document of the key points discussed
# Assistant: Here's a summary of the Q1 planning discussions...

Available Zapier Gmail Tools:

  • mcp__zapier__gmail_find_email - Search and find emails
  • mcp__zapier__gmail_send_email - Send new emails
  • mcp__zapier__gmail_create_draft - Create draft emails
  • mcp__zapier__gmail_create_draft_reply - Create draft replies
  • mcp__zapier__gmail_reply_to_email - Send replies
  • mcp__zapier__gmail_add_label_to_email - Add labels
  • mcp__zapier__gmail_remove_label_from_email - Remove labels
  • mcp__zapier__gmail_create_label - Create new labels
  • mcp__zapier__gmail_archive_email - Archive emails
  • mcp__zapier__gmail_delete_email - Move to trash
  • mcp__zapier__add_tools - Add new Zapier actions
  • mcp__zapier__edit_tools - Edit existing actions

Advanced Zapier Workflows:

# Email management workflow
swiftopenai agent \
  --interactive \
  --mcp-servers zapier \
  --allowed-tools "mcp__zapier__*"
# You: Find all newsletter emails from this week
# You: Create a summary of the key points
# You: Draft an email to my team with the summary
# You: Archive the original newsletters

# Customer support automation
swiftopenai agent "Find all customer support emails from today, categorize them by urgency, and create draft responses for each" \
  --mcp-servers zapier \
  --allowed-tools "mcp__zapier__gmail_*"

# Email analytics
swiftopenai agent "Analyze my email patterns: who do I email most, what times do I send emails, and what are common subjects" \
  --mcp-servers zapier \
  --allowed-tools "mcp__zapier__gmail_find_email" \
  --output-format json

Tips for Zapier MCP:

  • Zapier requires specific headers that our custom transport handles automatically
  • The connection may take a moment to initialize (HTTP 202 responses are normal)
  • Use --model gpt-4o-mini for cost-effective email operations
  • Always use --allowed-tools to explicitly grant permissions
  • Gmail queries support advanced search syntax (e.g., "from:boss subject:urgent")
Multi-Server Usage
# Use multiple MCP servers together (--allowed-tools required)
swiftopenai agent "Read the README.md file and create a GitHub issue about missing docs" \
  --mcp-servers fs,github \
  --allowed-tools "mcp__fs__*,mcp__github__*"

# Interactive with multiple servers
swiftopenai agent --interactive --mcp-servers github,postgres,fs --allowed-tools "mcp__*"
# You: Read the database schema from schema.sql
# You: Check if there are any GitHub issues about database migrations
# You: Create a migration script based on the schema changes

Advanced Features

Tool Naming Convention

MCP tools follow the naming pattern: mcp__serverName__toolName

# Tools are automatically namespaced
mcp__github__create_issue
mcp__github__list_repos
mcp__postgres__query
mcp__fs__read_file
Pattern Matching with --allowed-tools

Control which tools the agent can use with glob patterns:

# Allow only GitHub tools
swiftopenai agent "Help me manage my repos" --allowed-tools "mcp__github__*"

# Allow specific tools from multiple servers
swiftopenai agent "Analyze data" --allowed-tools "mcp__postgres__query,mcp__fs__read*"

# Allow all MCP tools but no built-in tools
swiftopenai agent "Do something" --allowed-tools "mcp__*"

# Mix different MCP server tools
swiftopenai agent "Query and store data" --allowed-tools "mcp__postgres__*,mcp__fs__*"
Auto-Installation

NPX packages are automatically installed with the -y flag - no manual installation needed:

# Just add and use - auto-installs on first run
swiftopenai config mcp add newserver npx --args "@org/mcp-server" --enable
swiftopenai agent "Use the new server" --mcp-servers newserver
# Automatically runs: npx -y @org/mcp-server
Performance Optimization

In interactive mode, MCP servers are initialized once and reused for the entire session:

swiftopenai agent --interactive --mcp-servers github,postgres
# 🚀 MCP servers initialized once for this session
# Fast responses - no reconnection between messages!

Non-interactive mode creates fresh connections for each command (stateless execution).

Real-World Scenarios

Development Workflow
# Morning standup prep
swiftopenai agent \
  --interactive \
  --mcp-servers github,postgres \
  --allowed-tools "mcp__*"
# You: Show me all PRs assigned to me
# You: Check if the database has the migrations from PR #123
# You: List all issues labeled 'bug' created yesterday
# You: Generate a summary for the standup
Content Management
# Blog post workflow
swiftopenai agent \
  --interactive \
  --mcp-servers fs,github \
  --allowed-tools "mcp__*"
# You: Read all markdown files in the blog directory
# You: Create a new post about MCP integration
# You: Generate a table of contents
# You: Create a PR with the new post
Data Analysis
# Sales analysis
swiftopenai agent \
  --interactive \
  --mcp-servers postgres \
  --allowed-tools "mcp__postgres__*"
# You: Show me total sales by region this quarter
# You: Calculate the month-over-month growth rate
# You: Which products have the highest margin?
# You: Export the top 10 products to a report

Troubleshooting

NPX not found:

# Find npx location
which npx
# Usually: /usr/local/bin/npx or ~/.nvm/versions/node/vX.X.X/bin/npx

# The CLI automatically resolves npx through PATH
# If issues persist, use the full path in configuration

Authentication errors:

# Check environment variables are set correctly
swiftopenai config mcp list
# Verify the env vars show correctly

# Test with a simple command
swiftopenai agent "Test connection" \
  --mcp-servers github \
  --show-mcp-status

Tool not found:

# List all available tools
swiftopenai agent "What tools are available?" \
  --mcp-servers github \
  --show-mcp-status

# Tools are named: mcp__serverName__toolName
# Use --allowed-tools with correct pattern
swiftopenai agent "Do something" --allowed-tools "mcp__github__*"

Server not connecting:

# Enable verbose output
swiftopenai agent "Test" \
  --mcp-servers myserver \
  --show-mcp-status

# Check server is enabled
swiftopenai config mcp list

# Try re-adding with correct arguments
swiftopenai config mcp remove myserver
swiftopenai config mcp add myserver npx --args "@correct/package" --enable

Image Generation

# Generate an image
swiftopenai image "A sunset over mountains in watercolor style"

# With options
swiftopenai image "A futuristic city" --model dall-e-3 --size 1024x1024 --quality hd

# Save to directory
swiftopenai image "A cat" --output ./images

List Models

# List all models
swiftopenai models

# Filter models
swiftopenai models --filter gpt

# Detailed view
swiftopenai models --detailed

Embeddings

# Generate embeddings
swiftopenai embed "Hello world"

# Save to file
swiftopenai embed "Your text here" --output embeddings.json

# Show statistics
swiftopenai embed "Text to embed" --stats

Configuration

# Set API key
swiftopenai config set api-key sk-...

# Get configuration value
swiftopenai config get default-model

# List all settings
swiftopenai config list

GPT-5 Models

SwiftOpenAI-CLI now supports GPT-5 models with advanced reasoning and verbosity controls. The CLI automatically normalizes model names for convenience:

Supported Model Names

  • gpt5 or gpt-5 - Complex reasoning, broad world knowledge, and code-heavy or multi-step agentic tasks
  • gpt5mini or gpt-5-mini - Cost-optimized reasoning and chat; balances speed, cost, and capability
  • gpt5nano or gpt-5-nano - High-throughput tasks, especially simple instruction-following or classification
# Fast coding assistance with minimal reasoning
swiftopenai "Write a Python function to sort a list" --model gpt5 --reasoning minimal --verbose low

# Detailed explanations with thorough reasoning
swiftopenai "Explain quantum entanglement" --model gpt-5 --reasoning high --verbose high

# Cost-optimized with balanced settings
swiftopenai "Help me debug this code" --model gpt5mini --reasoning medium --verbose medium

# High-throughput simple tasks
swiftopenai "Classify this text as positive or negative" --model gpt5nano --reasoning minimal --verbose low

# Interactive mode with GPT-5 Mini
swiftopenai chat --interactive --model gpt-5-mini --reasoning minimal

Verbosity Levels

  • low - Concise responses with minimal detail
  • medium - Balanced responses (default)
  • high - Detailed, comprehensive responses

Reasoning Effort

  • minimal - Very few reasoning tokens for fastest response (great for coding and instructions)
  • low - Light reasoning for simple tasks
  • medium - Balanced reasoning (default)
  • high - Thorough reasoning for complex problems

Notes:

  • The --verbose and --reasoning parameters only apply to GPT-5 family models
  • Model names are case-insensitive and hyphens are optional (e.g., gpt5, GPT5, gpt-5 all work)
  • The CLI automatically normalizes model names to the correct API format
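The normalization can be approximated as lowercasing the name and mapping the known aliases to their hyphenated API forms (a sketch, not the CLI's actual code):

```shell
# Approximate GPT-5 model-name normalization (sketch; not the CLI's actual code).
normalize() {
  # lowercase first, then rewrite known aliases to the hyphenated API names
  name=$(printf '%s' "$1" | tr '[:upper:]' '[:lower:]')
  case "$name" in
    gpt5)     echo "gpt-5" ;;
    gpt5mini) echo "gpt-5-mini" ;;
    gpt5nano) echo "gpt-5-nano" ;;
    *)        echo "$name" ;;   # already in API form, pass through
  esac
}

normalize GPT5       # -> gpt-5
normalize gpt5mini   # -> gpt-5-mini
normalize gpt-5-nano # -> gpt-5-nano
```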

Using Alternative Providers

SwiftOpenAI-CLI supports any OpenAI-compatible API provider. Built with SwiftOpenAI v4.3.2, it can connect to providers like Grok, Groq, OpenRouter, DeepSeek, and more. Configure the CLI to use these providers:

Note: When using alternative providers, use the --model flag with the provider's specific model names. For example:

  • OpenRouter: anthropic/claude-3.5-sonnet, openai/gpt-4, google/gemini-pro
  • DeepSeek: deepseek-reasoner, deepseek-chat
  • Groq: llama2-70b-4096, mixtral-8x7b-32768

Grok (xAI)

# Configure for Grok
swiftopenai config set provider grok
swiftopenai config set base-url https://api.x.ai
swiftopenai config set api-key your-grok-api-key

# Use Grok models
swiftopenai "What's the latest in AI?" --model grok-beta

Groq

# Configure for Groq
swiftopenai config set provider groq
swiftopenai config set base-url https://api.groq.com
swiftopenai config set api-key your-groq-api-key

# Use Groq models
swiftopenai "Explain quantum computing" --model llama2-70b-4096

Local Models (Ollama)

# Configure for Ollama
swiftopenai config set provider ollama
swiftopenai config set base-url http://localhost:11434
swiftopenai config set api-key optional-or-empty

# Use local models
swiftopenai "Hello!" --model llama2

OpenRouter (Access 300+ Models)

# Configure for OpenRouter
swiftopenai config set provider openrouter
swiftopenai config set base-url https://openrouter.ai/api
swiftopenai config set api-key your-openrouter-api-key

# Use any of the 300+ available models
swiftopenai "Explain quantum computing" --model anthropic/claude-3.5-sonnet
swiftopenai "Write a haiku" --model openai/gpt-4-turbo
swiftopenai "Solve this math problem" --model google/gemini-pro

DeepSeek (Advanced Reasoning Models)

# Configure for DeepSeek
swiftopenai config set provider deepseek
swiftopenai config set base-url https://api.deepseek.com
swiftopenai config set api-key your-deepseek-api-key

# Use DeepSeek models
swiftopenai "What is the Manhattan project?" --model deepseek-reasoner
swiftopenai "Explain step by step how to solve x^2 + 5x + 6 = 0" --model deepseek-chat

Reset to OpenAI

# Clear provider configuration to use OpenAI
swiftopenai config set provider ""
swiftopenai config set base-url ""

Quick Provider Switching

You can create shell aliases for quick provider switching:

# Add to your ~/.zshrc or ~/.bashrc
alias ai-openai='swiftopenai config set provider "" && swiftopenai config set base-url ""'
alias ai-grok='swiftopenai config set provider grok && swiftopenai config set base-url https://api.x.ai'
alias ai-deepseek='swiftopenai config set provider deepseek && swiftopenai config set base-url https://api.deepseek.com'
alias ai-openrouter='swiftopenai config set provider openrouter && swiftopenai config set base-url https://openrouter.ai/api'

# Then switch providers easily
ai-deepseek
swiftopenai "What is quantum entanglement?" --model deepseek-reasoner

ai-openrouter
swiftopenai "Write a poem" --model anthropic/claude-3-haiku

Debug Mode

Enable debug mode to see detailed API requests and responses:

# Enable debug mode
swiftopenai config set debug true

# Disable debug mode
swiftopenai config set debug false

When debug mode is enabled and the CLI is built in debug configuration, you'll see:

  • Full curl commands for API requests
  • HTTP response headers and status codes
  • Raw JSON responses from the API

Note: Debug output requires building the CLI in debug mode (swift build) rather than release mode.

Command Options

Global Options

  • --help - Show help information
  • --version - Show version

Chat Options

  • -m, --model - Model to use (default: gpt-4o)
  • -i, --interactive - Interactive chat mode
  • -p, --plain - Plain output without formatting (useful for scripts)
  • --system - System prompt
  • --temperature - Temperature (0.0-2.0)
  • --max-tokens - Maximum tokens to generate
  • --no-stream - Disable streaming response
  • --verbose - Verbosity level for GPT-5 models (low, medium, high) - default: medium
  • --reasoning - Reasoning effort for GPT-5 models (minimal, low, medium, high) - default: medium

Image Options

  • -n, --number - Number of images (1-10, dall-e-3 only supports 1)
  • --size - Image size:
    • dall-e-2: 256x256, 512x512, 1024x1024
    • dall-e-3: 1024x1024, 1792x1024, 1024x1792
  • --model - Model (dall-e-2, dall-e-3)
  • --quality - Quality (standard, hd - dall-e-3 only)
  • --output - Output directory for saving images

Examples

GPT-5 Examples

# Using different GPT-5 models (with or without hyphens)
$ swiftopenai "Generate a sorting algorithm" --model gpt5 --reasoning high
$ swiftopenai "Summarize this text" --model gpt5mini --verbose low
$ swiftopenai "Yes or No?" --model gpt5nano --reasoning minimal

# The CLI normalizes these model names automatically:
# gpt5 → gpt-5
# gpt5mini → gpt-5-mini  
# gpt5nano → gpt-5-nano

Interactive Chat Session

$ swiftopenai chat -i
🤖 OpenAI Chat (gpt-4o)
Type 'exit' to quit, 'clear' to clear history

You: Hello! Can you help me with Swift?
Assistant: Of course! I'd be happy to help you with Swift...

You: exit
Goodbye!

Generate Multiple Images

$ swiftopenai image "A serene landscape" -n 3 --output ./landscapes
Generating image with prompt: "A serene landscape"
Model: dall-e-3, Size: 1024x1024, Quality: standard

Generated 3 image(s):
1. URL: https://...
   Saved to: ./landscapes/dalle_1_1234567890.png
2. URL: https://...
   Saved to: ./landscapes/dalle_2_1234567890.png
3. URL: https://...
   Saved to: ./landscapes/dalle_3_1234567890.png

Using Plain Output in Scripts

# Get a plain response for use in scripts
$ answer=$(swiftopenai -p "What is 2+2?")
$ echo "The answer is: $answer"
The answer is: 4

# Compare with formatted output
$ swiftopenai "What is 2+2?"
Assistant: 4

Using it with Claude Code

Demo video: claudeCodeGPT41chat.mp4 (available in the repository)

Requirements

  • macOS 13.0+
  • Swift 5.9+

License

MIT License

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Acknowledgments

Built with SwiftOpenAI.
