Decentralised, Self-hosted, Minable Processing Units For LLM Models

Powered By

Decentralised Ownership

Artificial Intelligence For Everyone

LocalAI uses an open-source software stack to enable everyone to run their own machine learning models. Our goal is to give everyone unrestricted access to machine learning resources that anyone can run.


Use open-source models, including CodeLlama

LocalAI enables developers to run their own and community-created open-source models, with up to 70 billion parameters, directly in their code.

ollama run codellama 'Where is the bug in this code?

def fib(n):
    if n <= 0:
        return n
    else:
        return fib(n-1) + fib(n-2)'
curl -X POST http://localhost:11434/api/generate -d '{
  "model": "codellama",
  "prompt": "Write a function that outputs the fibonacci sequence"
}'

def fib(n):
    if n <= 1:
        return n
    else:
        return fib(n-1) + fib(n-2)

Interact with LLMs via your command line

LocalAI enables developers to run their own and community-created open-source models directly in their code.
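The same endpoint shown in the curl example above can be called from code. The sketch below assumes an Ollama-compatible server on localhost:11434 and its newline-delimited JSON streaming format ("response" and "done" fields); `collect_stream` and `generate` are illustrative helper names, not part of LocalAI itself.

```python
import json
import urllib.request

def collect_stream(lines):
    """Concatenate the "response" fields from a stream of
    newline-delimited JSON chunks until a chunk reports done."""
    parts = []
    for line in lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

def generate(prompt, model="codellama",
             url="http://localhost:11434/api/generate"):
    """Send a prompt to the local model and return the full completion."""
    payload = json.dumps({"model": model, "prompt": prompt}).encode("utf-8")
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"})
    # The response body yields one JSON chunk per line.
    with urllib.request.urlopen(req) as resp:
        return collect_stream(resp)
```

With a model pulled and the server running, `generate("Write a function that outputs the fibonacci sequence")` returns the assembled completion as a single string.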

Meet LocalAI Chat Agent

The LocalAI chat agent is a state-of-the-art LLM trained using the first decentralised GPU hardware. It demonstrates how the power of LocalAI can be used to create high-performing LLM models.

LocalAI On Ethereum

The token powering the LocalAI architecture

By owning $LOCAI, the cryptocurrency backing the LocalAI architecture, you get access to the latest releases and support the platform's growth through several offered initiatives.
