Conversation

@BrewTestBot commented Jun 19, 2025

Created by `brew bump`.

Created with `brew bump-formula-pr`.

  • resource blocks have been checked for updates.
Release notes:
πŸš€ LocalAI 3.0 – A New Era Begins

Say hello to LocalAI 3.0 β€” our most ambitious release yet!

We’ve taken huge strides toward making LocalAI not just local, but limitless. Whether you're building LLM-powered agents, experimenting with audio pipelines, or deploying multimodal backends at scale β€” this release is for you.

Let’s walk you through what’s new. (And yes, there’s a lot to love.)

TL;DR – What’s New in LocalAI 3.0.0 πŸŽ‰

  • 🧩 Backend Gallery: Install/remove backends on the fly, powered by OCI images β€” fully customizable and API-driven.
  • πŸŽ™οΈ Audio Support: Upload audio, PDFs, or text in the UI β€” plus new audio understanding models like Qwen Omni.
  • 🌐 Realtime API: WebSocket support compatible with OpenAI clients, great for chat apps and agents.
  • 🧠 Reasoning UI Boosts: Thinking indicators now show in chat for smart models.
  • πŸ“Š Dynamic VRAM Handling: Smarter GPU usage with automatic offloading.
  • πŸ¦™ Llama.cpp Upgrades: Now with reranking + multimodal via libmtmd.
  • πŸ“¦ 50+ New Models: Huge model gallery update with fresh LLMs across categories.
  • 🐞 Bug Fixes: Streamed runes, template stability, better backend gallery UX.
  • ❌ Deprecated: Extras images β€” replaced by the new backend system.

πŸ‘‰ Dive into the full changelog and docs below to explore more!

🧩 Introducing the Backend Gallery β€” Plug, Play, Power Up

[Screenshot: the new Backend Gallery in the WebUI]

No more hunting for dependencies or custom hacks.

With the new Backend Gallery, you can now:

  • Install & remove backends at runtime or startup via API or directly from the WebUI
  • Use custom galleries, just like you do for models
  • Enjoy zero-config access to the default LocalAI gallery

Backends are standard OCI images β€” portable, composable, and totally DIY-friendly. Goodbye to "extras images" β€” hello to full backend modularity, even with Python-based dependencies.

πŸ“– Explore the Backend Gallery Docs
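As an illustration, installing a backend through the HTTP API might look like the sketch below. The endpoint path and the backend id are assumptions modelled on the model-gallery "apply" convention, not confirmed from the docs, and a LocalAI instance is assumed to be listening on localhost:8080:

```shell
# Hypothetical sketch: install a backend from a gallery via the API.
# "/backends/apply" and the backend id are assumptions, not documented facts.
PAYLOAD='{"id": "localai@my-backend"}'
curl -s -X POST http://localhost:8080/backends/apply \
  -H "Content-Type: application/json" \
  -d "$PAYLOAD" || true   # tolerate a missing server in this sketch
```

The same operation is also available from the WebUI, so the API route is mainly useful for automation and startup scripts.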

⚠️ Important: Breaking Changes

Starting with this release, we will stop pushing `-extras` images containing Python backends. You can now use the standard images and simply pick the one suited to your GPU; additional backends can be installed via the backend gallery.

Some examples are shown below. Note that the CI is still publishing the images, so they won't be available until the jobs finish; the installation scripts will be updated once the images are publicly available.

CPU only image:

docker run -ti --name local-ai -p 8080:8080 localai/localai:latest

NVIDIA GPU Images:

# CUDA 12
docker run -ti --name local-ai -p 8080:8080 --gpus all localai/localai:latest-gpu-nvidia-cuda-12

# CUDA 11
docker run -ti --name local-ai -p 8080:8080 --gpus all localai/localai:latest-gpu-nvidia-cuda-11

# NVIDIA Jetson (L4T) ARM64
docker run -ti --name local-ai -p 8080:8080 --gpus all localai/localai:latest-nvidia-l4t-arm64

AMD GPU Images (ROCm):

docker run -ti --name local-ai -p 8080:8080 --device=/dev/kfd --device=/dev/dri --group-add=video localai/localai:latest-gpu-hipblas

Intel GPU Images (oneAPI):

# Intel GPU with FP16 support
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-gpu-intel-f16

# Intel GPU with FP32 support
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-gpu-intel-f32

Vulkan GPU Images:

docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-gpu-vulkan

AIO Images (pre-downloaded models):

# CPU version
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-aio-cpu

# NVIDIA CUDA 12 version
docker run -ti --name local-ai -p 8080:8080 --gpus all localai/localai:latest-aio-gpu-nvidia-cuda-12

# NVIDIA CUDA 11 version
docker run -ti --name local-ai -p 8080:8080 --gpus all localai/localai:latest-aio-gpu-nvidia-cuda-11

# Intel GPU version
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-aio-gpu-intel-f16

# AMD GPU version
docker run -ti --name local-ai -p 8080:8080 --device=/dev/kfd --device=/dev/dri --group-add=video localai/localai:latest-aio-gpu-hipblas

For more information about the AIO images and pre-downloaded models, see Container Documentation.

🧠 Smarter Reasoning, Smoother Chat

[Screenshot: the "thinking" indicator in the chat UI]

  • Realtime WebSocket API: OpenAI-style streaming support via WebSocket is here. Ideal for agents and chat apps.
  • "Thinking" Tags: Reasoning models now show a visual "thinking" box during inference in the UI. Intuitive and satisfying.

🧠 Model Power-Up: VRAM Savvy + Multimodal Brains

Dynamic VRAM Estimation: LocalAI now adapts and offloads layers depending on your GPU’s capabilities. Optimal performance, no guesswork.
Llama.cpp upgrades also include:

  • reranking
  • Enhanced multimodal support via libmtmd

πŸ§ͺ New Models!

More than 50 new models joined the gallery, including:

  • 🧠 skywork-or1-32b, rivermind-lux-12b, qwen3-embedding-*, llama3-24b-mullein, ultravox-v0_5, and more
  • 🧬 Multimodal, reasoning, and domain-specific LLMs for every need
  • πŸ“¦ Browse the latest additions in the Model Gallery

🐞 Bugfixes & Polish

  • Rune streaming is now buttery smooth
  • Countless fixes across templates, inputs, CI, and realtime session updates
  • Backend gallery UI is more stable and informative

The Complete Local Stack for Privacy-First AI

With LocalAGI rejoining LocalAI alongside LocalRecall, our ecosystem provides a complete, open-source stack for private, secure, and intelligent AI operations:

LocalAI

The free, Open Source OpenAI alternative. Acts as a drop-in replacement REST API compatible with OpenAI specifications for local AI inferencing. No GPU required.

Link: https://github.com/mudler/LocalAI
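Because the REST API is OpenAI-compatible, any OpenAI client or plain curl call works against a local instance by swapping the base URL; the model name below is illustrative and assumes a model by that name is installed:

```shell
# Sketch: an OpenAI-style chat completion against a local instance.
# The model name is an assumption; substitute one you have installed.
MODEL='gpt-4'
curl -s http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d "{\"model\":\"$MODEL\",\"messages\":[{\"role\":\"user\",\"content\":\"Hello\"}]}" \
  || true   # tolerate a missing server in this sketch
```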

LocalAGI

A powerful Local AI agent management platform. Serves as a drop-in replacement for OpenAI's Responses API, supercharged with advanced agentic capabilities and a no-code UI.

Link: https://github.com/mudler/LocalAGI

LocalRecall

A RESTful API and knowledge base management system providing persistent memory and storage capabilities for AI agents. Designed to work alongside LocalAI and LocalAGI.

Link: https://github.com/mudler/LocalRecall

Join the Movement! ❀️

A massive THANK YOU to our incredible community and our sponsors! LocalAI has over 33,300 stars, and LocalAGI has already rocketed past 750+ stars!

As a reminder, LocalAI is real FOSS (Free and Open Source Software) and its sibling projects are community-driven and not backed by VCs or a company. We rely on contributors donating their spare time and our sponsors to provide us the hardware! If you love open-source, privacy-first AI, please consider starring the repos, contributing code, reporting bugs, or spreading the word!

πŸ‘‰ Check out the reborn LocalAGI v2 today: https://github.com/mudler/LocalAGI

LocalAI 3.0.0 is here. What will you build next?

Full changelog:

What's Changed

Breaking Changes πŸ› 

Bug fixes :bug:

Exciting New Features πŸŽ‰

🧠 Models

πŸ“– Documentation and examples

πŸ‘’ Dependencies

Other Changes

New Contributors

Full Changelog: mudler/LocalAI@v2.29.0...v3.0.0-alpha1

View the full release notes at https://github.com/mudler/LocalAI/releases/tag/v3.0.0.


@github-actions bot added the `go` (Go use is a significant feature of the PR or issue) and `bump-formula-pr` (PR was created using `brew bump-formula-pr`) labels Jun 19, 2025
@chenrui333 commented Jun 19, 2025

  fatal: not a git repository (or any of the parent directories): .git
  rm -rf local-ai || true
  CGO_LDFLAGS=" -framework Accelerate -framework Foundation -framework Metal -framework MetalKit -framework MetalPerformanceShaders" go build -ldflags "-s -w -X "github.com/mudler/LocalAI/internal.Version=3.0.0" -X "github.com/mudler/LocalAI/internal.Commit="" -tags "" -o local-ai ./
  rice append --exec local-ai
  make: rice: No such file or directory
  make: *** [build] Error 1
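The failure above is the go.rice `rice` embedding tool missing from PATH. One plausible local workaround, assuming a working Go toolchain with `$GOPATH/bin` on PATH, is sketched below; judging by the later PR title ("go-rice 1.0.3 (new formula)"), the actual fix here was packaging go-rice as its own Homebrew formula:

```shell
# Sketch: install the go.rice CLI so `rice append --exec` is available.
# Assumes a Go toolchain is installed; skipped otherwise.
RICE_PKG='github.com/GeertJohan/go.rice/rice@latest'
command -v go >/dev/null 2>&1 && go install "$RICE_PKG" || true
```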

@chenrui333 added the `build failure` label (CI fails while building the software) Jun 19, 2025
@chenrui333 chenrui333 force-pushed the bump-localai-3.0.0 branch from 943511d to 9b62270 Compare June 19, 2025 20:14
@chenrui333 changed the title from localai 3.0.0 to go-rice 1.0.3 (new formula) localai 3.0.0 Jun 19, 2025
@github-actions bot added the `new formula` label (PR adds a new formula to Homebrew/homebrew-core) Jun 19, 2025
@chenrui333 added the `ready to merge` label (PR can be merged once CI is green) and removed the `new formula` and `build failure` labels Jun 19, 2025
@chenrui333 chenrui333 mentioned this pull request Jun 19, 2025
1 task
@chenrui333 chenrui333 force-pushed the bump-localai-3.0.0 branch from 9b62270 to 03c3dd6 Compare June 19, 2025 20:27
@github-actions bot added the `new formula` label (PR adds a new formula to Homebrew/homebrew-core) Jun 19, 2025
chenrui333 and others added 2 commits June 19, 2025 16:34
localai: update build

Signed-off-by: Rui Chen <[email protected]>
@chenrui333 chenrui333 force-pushed the bump-localai-3.0.0 branch from 03c3dd6 to 6f81b90 Compare June 19, 2025 20:34

πŸ€– An automated task has requested bottles to be published to this PR.

Please do not push to this PR branch before the bottle commits have been pushed, as this results in a state that is difficult to recover from. If you need to resolve a merge conflict, please use a merge commit. Do not force-push to this PR branch.

@github-actions bot added the `CI-published-bottle-commits` label (the commits for the built bottles have been pushed to the PR branch) Jun 19, 2025
@BrewTestBot BrewTestBot enabled auto-merge June 20, 2025 00:00
@BrewTestBot BrewTestBot added this pull request to the merge queue Jun 20, 2025
Merged via the queue into master with commit 0e5aca2 Jun 20, 2025
17 checks passed
@BrewTestBot BrewTestBot deleted the bump-localai-3.0.0 branch June 20, 2025 00:09