
LLAM 🤖

A Python-based multiplatform chat application for local AI model inference, built with Flet for cross-platform compatibility.

✨ Features

  • 🔄 Full Ollama integration - Compatible with all Ollama setups (see the API sketch after this list)
  • 🔀 Model switching - Switch between different LLMs seamlessly
  • 🌐 Network flexibility - Configure custom local IP addresses
  • 📝 Rich markdown support - Displays tables, code blocks, and formatted text
  • 🖥️ Cross-platform - Runs on Windows, macOS, and Linux
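
Under the hood, the Ollama integration, model switching, and custom-IP options all come down to calls against Ollama's HTTP API. The sketch below is not the app's actual code; the host address and the use of the requests library are assumptions, but the two endpoints shown (/api/tags to list installed models, /api/chat to send a message) are the standard ones any Ollama client uses.

    import requests

    # Any reachable Ollama instance; Ollama listens on localhost:11434 by default.
    OLLAMA_HOST = "http://192.168.1.50:11434"  # hypothetical custom local IP

    def list_models(host: str = OLLAMA_HOST) -> list[str]:
        """Return the names of locally installed models (feeds the model switcher)."""
        resp = requests.get(f"{host}/api/tags", timeout=10)
        resp.raise_for_status()
        return [m["name"] for m in resp.json().get("models", [])]

    def chat(prompt: str, model: str, host: str = OLLAMA_HOST) -> str:
        """Send one chat turn to Ollama and return the assistant's reply."""
        payload = {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,  # ask for a single JSON response instead of a token stream
        }
        resp = requests.post(f"{host}/api/chat", json=payload, timeout=120)
        resp.raise_for_status()
        return resp.json()["message"]["content"]

    if __name__ == "__main__":
        models = list_models()
        print("Available models:", models)
        print(chat("Hello!", model=models[0]))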

🚀 Quick Start

Prerequisites

  • Python 3.8+
  • Ollama installed and running

Installation

  1. Clone the repository

    git clone https://github.com/Bestello/llam.git
  2. Navigate to the project directory

    cd llam
  3. Install dependencies

    pip install -r requirements.txt
  4. Run the application (a minimal main.py sketch follows these steps)

    flet run main.py
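
The flet run command expects main.py to define a main(page) function and hand it to ft.app. The snippet below is a stripped-down stand-in for the real main.py, meant only to show the entry-point shape; the control layout is illustrative, not the app's actual UI.

    import flet as ft

    def main(page: ft.Page):
        page.title = "LLAM"

        history = ft.ListView(expand=True, spacing=10)   # rendered chat turns
        prompt = ft.TextField(label="Message", expand=True)

        def send(_event):
            # Placeholder reply; the real app would call Ollama here.
            history.controls.append(ft.Markdown(f"**You:** {prompt.value}"))
            history.controls.append(ft.Markdown("**Model:** (reply goes here)"))
            prompt.value = ""
            page.update()

        page.add(history, ft.Row([prompt, ft.ElevatedButton("Send", on_click=send)]))

    ft.app(target=main)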

📱 Building for Mobile (Android APK)

To create an Android APK using Flet:

  1. Install Flet build tools

    pip install flet[build]
  2. Build the APK

    flet build apk
  3. Build with custom options (optional)

    flet build apk --project "LLAM Chat" --description "Local AI Chat App" --copyright "Your Name"

The APK will be generated in the build/apk directory.

Note: You'll need the Android SDK and a Java Development Kit (JDK) installed to build the APK. Check Flet's documentation for detailed setup instructions.

📋 Roadmap

🎯 Planned Features

  • 🖼️ Image support for multimodal models
  • 🎤 Text-to-Speech (TTS) integration
  • 🔊 Speech-to-Text (STT) capabilities
  • 📚 Chat history persistence
  • ℹ️ About section

🐛 Known Issues

  • Codebase needs refactoring
  • Error logging needs improvement
  • Connection errors are not handled gracefully
  • Model loading is not yet optimized
  • Chat context is not preserved between messages

📸 Screenshots

🤝 Contributing

Contributions are welcome! This project is actively being improved. Feel free to:

  • 🐛 Report bugs
  • 💡 Suggest features
  • 🔧 Submit pull requests

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments


Star this repo if you find it useful!
