Created by `brew bump`

Created with `brew bump-formula-pr`.

Release notes
What’s New in Detail
🚀 New Backends and Model Support
We've significantly expanded the range of models you can run with LocalAI!
- New `mlx-audio` backend. An example configuration is sketched after this list.
- Video generation via the `diffusers` backend, supporting both I2V (image-to-video) and T2V (text-to-video). An example configuration is sketched after this list.
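The release ships example configurations for these backends; they are not reproduced here, so the following is only a rough sketch of what LocalAI model definition files for them might look like. The model names, repository IDs, and backend-specific options are placeholders, not values taken from the release notes.

```yaml
# Hypothetical model definitions, shown as two YAML documents; in practice each
# would live in its own file under the models directory.

# Sketch of a definition for the new mlx-audio backend.
name: my-audio-model                     # placeholder name
backend: mlx-audio
parameters:
  model: organization/audio-model-id     # placeholder repository ID
---
# Sketch of a definition for video generation (T2V/I2V) with the diffusers backend.
name: my-video-model                     # placeholder name
backend: diffusers
f16: true
parameters:
  model: organization/video-model-id     # placeholder repository ID
diffusers:
  cuda: true
  pipeline_type: WanPipeline             # placeholder; use the pipeline your model requires
```

Definitions like these are dropped into the models directory and the model is then requested by name through the usual OpenAI-compatible endpoints.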
✨ WebUI Improvements

We've added several new features to make using LocalAI even easier:
🚀 Performance & Architecture Improvements
🛠️ Simplified Management – Introducing the LocalAI Launcher (Alpha)
We're excited to introduce the first version of the LocalAI Launcher! This application simplifies:
Please note: the launcher is in Alpha and may have bugs. The macOS build is not yet signed, so running it requires the workaround described here: https://discussions.apple.com/thread/253714860?answerId=257037956022#257037956022.
✅ Bug Fixes & Stability Improvements
- Fixed the `libomp.so` issue on macOS Docker containers.
- macOS (darwin) builds now bundle the `libutf8` libraries.

Additional Improvements
- Backends can now live in a system-wide path (configured via `LOCALAI_BACKENDS_SYSTEM_PATH` or via command-line arguments), defaulting to `/usr/share/localai/backends`. This allows specifying a read-only directory for backends, useful for package management and system-wide installations; see the sketch after this list.
- `ref_images` is now preferred over `src` for more robust loading behavior.
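As a rough illustration of the system-wide backends path (the environment variable name and default path come from the notes above; the image tag, port, and volume layout are assumptions), a Docker Compose setup could mount a read-only backends directory and point LocalAI at it:

```yaml
# Hypothetical docker-compose.yaml snippet; adjust image, ports, and paths to your setup.
services:
  localai:
    image: localai/localai:latest
    ports:
      - "8080:8080"
    environment:
      - LOCALAI_BACKENDS_SYSTEM_PATH=/usr/share/localai/backends
    volumes:
      - ./backends:/usr/share/localai/backends:ro   # read-only, system-wide backends
      - ./models:/models
```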
🚨 Important Notes

The Complete Local Stack for Privacy-First AI
LocalAI
The free, Open Source OpenAI alternative. Acts as a drop-in replacement REST API compatible with OpenAI specifications for local AI inferencing. No GPU required.
Link: https://github.com/mudler/LocalAI
LocalAGI
A powerful Local AI agent management platform. Serves as a drop-in replacement for OpenAI's Responses API, supercharged with advanced agentic capabilities and a no-code UI.
Link: https://github.com/mudler/LocalAGI
LocalRecall
A RESTful API and knowledge base management system providing persistent memory and storage capabilities for AI agents. Designed to work alongside LocalAI and LocalAGI.
Link: https://github.com/mudler/LocalRecall
Thank you! ❤️
A massive THANK YOU to our incredible community and our sponsors! LocalAI has over 35,000 stars, and LocalAGI has already rocketed past 1,100 stars!
As a reminder, LocalAI is real FOSS (Free and Open Source Software), and its sibling projects are community-driven, not backed by VCs or a company. We rely on contributors who donate their spare time and on sponsors who provide the hardware! If you love open-source, privacy-first AI, please consider starring the repository, contributing code, reporting bugs, or spreading the word!
Full changelog :point_down:
:point_right: Click to expand :point_left:
What's Changed
Bug fixes :bug:
Exciting New Features 🎉
- fix(llama-cpp/darwin): make sure to bundle `libutf8` libs by @mudler in mudler/LocalAI#6060

🧠 Models
📖 Documentation and examples
👒 Dependencies
Other Changes
- chore: ⬆️ Update ggml-org/whisper.cpp to `5527454cdb3e15d7e2b8a6e2afcb58cb61651fd2` by @localai-bot in mudler/LocalAI#6047
- chore: ⬆️ Update ggml-org/llama.cpp to `f4586ee5986d6f965becb37876d6f3666478a961` by @localai-bot in mudler/LocalAI#6048
- chore: ⬆️ Update ggml-org/whisper.cpp to `16c2924cb2c4b5c9f79220aa7708eb5b346b029b` by @localai-bot in mudler/LocalAI#6055
- chore: ⬆️ Update ggml-org/llama.cpp to `29c8fbe4e05fd23c44950d0958299e25fbeabc5c` by @localai-bot in mudler/LocalAI#6054
- chore: ⬆️ Update ggml-org/whisper.cpp to `040510a132f0a9b51d4692b57a6abfd8c9660696` by @localai-bot in mudler/LocalAI#6069
- chore: ⬆️ Update ggml-org/llama.cpp to `5e6229a8409ac786e62cb133d09f1679a9aec13e` by @localai-bot in mudler/LocalAI#6070
- chore: ⬆️ Update ggml-org/llama.cpp to `1fe00296f587dfca0957e006d146f5875b61e43d` by @localai-bot in mudler/LocalAI#6079
- chore: ⬆️ Update ggml-org/llama.cpp to `21c17b5befc5f6be5992bc87fc1ba99d388561df` by @localai-bot in mudler/LocalAI#6084
- chore: ⬆️ Update ggml-org/llama.cpp to `6d7f1117e3e3285d0c5c11b5ebb0439e27920082` by @localai-bot in mudler/LocalAI#6088
- chore: ⬆️ Update ggml-org/whisper.cpp to `fc45bb86251f774ef817e89878bb4c2636c8a58f` by @localai-bot in mudler/LocalAI#6089
- chore: ⬆️ Update ggml-org/llama.cpp to `fb22dd07a639e81c7415e30b146f545f1a2f2caf` by @localai-bot in mudler/LocalAI#6112
- chore: ⬆️ Update ggml-org/llama.cpp to `7a6e91ad26160dd6dfb33d29ac441617422f28e7` by @localai-bot in mudler/LocalAI#6116
- chore: ⬆️ Update ggml-org/llama.cpp to `cd36b5e5c7fed2a3ac671dd542d579ca40b48b54` by @localai-bot in mudler/LocalAI#6118
- chore: ⬆️ Update ggml-org/llama.cpp to `710dfc465a68f7443b87d9f792cffba00ed739fe` by @localai-bot in mudler/LocalAI#6126
- chore: ⬆️ Update ggml-org/whisper.cpp to `7745fcf32846006128f16de429cfe1677c963b30` by @localai-bot in mudler/LocalAI#6136
- chore: ⬆️ Update ggml-org/llama.cpp to `043fb27d3808766d8ea8195bbd12359727264402` by @localai-bot in mudler/LocalAI#6137
- chore: ⬆️ Update ggml-org/llama.cpp to `c4e9239064a564de7b94ee2b401ae907235a8fca` by @localai-bot in mudler/LocalAI#6139
- chore: ⬆️ Update ggml-org/llama.cpp to `8b696861364360770e9f61a3422d32941a477824` by @localai-bot in mudler/LocalAI#6151
- chore: ⬆️ Update ggml-org/llama.cpp to `fbef0fad7a7c765939f6c9e322fa05cd52cf0c15` by @localai-bot in mudler/LocalAI#6155
- chore: ⬆️ Update ggml-org/llama.cpp to `c97dc093912ad014f6d22743ede0d4d7fd82365a` by @localai-bot in mudler/LocalAI#6163
- chore: ⬆️ Update ggml-org/llama.cpp to `3d16b29c3bb1ec816ac0e782f20d169097063919` by @localai-bot in mudler/LocalAI#6165
- chore: ⬆️ Update ggml-org/llama.cpp to `e92d53b29e393fc4c0f9f1f7c3fe651be8d36faa` by @localai-bot in mudler/LocalAI#6169
- chore: ⬆️ Update leejet/stable-diffusion.cpp to `4c6475f9176bf99271ccf5a2817b30a490b83db0` by @localai-bot in mudler/LocalAI#6171
- chore: ⬆️ Update ggml-org/llama.cpp to `d4d8dbe383e8b9600cbe8b42016e3a4529b51219` by @localai-bot in mudler/LocalAI#6172

New Contributors
Full Changelog: mudler/LocalAI@v3.4.0...v3.5.0
View the full release notes at https://github.com/mudler/LocalAI/releases/tag/v3.5.0.