Decentralised, Self-Hosted, Minable Processing Units for LLMs
Powered By
Decentralised Ownership
Artificial Intelligence For Everyone
LocalAI uses an open-source software stack so that everyone can run their own machine learning models. Our goal is to give everyone unlimited access to machine learning resources that anyone can run.
Use open-source models, including CodeLlama
LocalAI enables developers to run their own and community-created open-source models, with up to 70 billion parameters, directly from their code.
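As a minimal sketch of what "running a model directly from your code" could look like, the snippet below builds a chat-completion request for a locally hosted model. The endpoint path, port, and model name are illustrative assumptions (modelled on the common OpenAI-compatible API shape), not details confirmed by this page.

```python
import json

# Assumed, hypothetical endpoint of a self-hosted LocalAI node;
# the URL and model identifier are illustrative only.
BASE_URL = "http://localhost:8080/v1/chat/completions"
MODEL = "codellama-70b"  # assumed model name

def build_chat_request(prompt: str) -> str:
    """Build the JSON body for a single-turn chat-completion call."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return json.dumps(payload)

body = build_chat_request("Write a function that reverses a string.")
# Sending it would require a running node, e.g.:
#   req = urllib.request.Request(BASE_URL, body.encode(),
#       headers={"Content-Type": "application/json"})
#   urllib.request.urlopen(req)
print(body)
```

The request body follows the widely used chat-completion schema (a `model` name plus a list of role-tagged `messages`), which is the pattern most self-hosted inference servers expose.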
Interact with LLMs via your command line
LocalAI lets developers interact with their own and community-created open-source models directly from the command line.
Meet LocalAI Chat Agent
The LocalAI chat agent is a state-of-the-art LLM trained on the first decentralised GPU hardware. It demonstrates how the power of LocalAI can be used to create high-performing LLMs.
LocalAI On Ethereum
The token powering the LocalAI architecture
By owning $LOCAI, the cryptocurrency backing the LocalAI architecture, you get access to the latest releases and support the platform's growth through several initiatives.
Buy $LOCAI
Chart