
The Weights AI app (often called Weights or the Weights.gg mobile app) is an Android and iOS application that brings open-source large language models (LLMs) directly to your smartphone. Users can run, chat with, and even lightly fine-tune a wide range of uncensored, high-performance models (Llama 3.1, Mistral, Qwen, Gemma 2, Phi-3.5, DeepSeek, and many others) locally or via cloud inference, with no desktop GPU required. The app focuses on privacy, offline capability (for smaller models), fast local inference on mid-to-high-end phones, and an intuitive chat interface, making advanced AI accessible to mobile users, developers, roleplayers, writers, and privacy-conscious individuals.
Is Weights AI App Free or Paid?
The Weights AI app is free to download and use for core functionality. Most open-source models and local inference are available at no cost, limited only by your device hardware. Paid options cover cloud-accelerated inference (faster speeds on large models), access to the biggest models (70B–405B), higher daily token limits, a priority queue during peak times, and premium uncensored model variants. Paid plans are most valuable for heavy mobile users and those without high-end phones.
Weights AI App Pricing Details
Weights AI app pricing is subscription-based (monthly or annual billing, with discounts for yearly plans). Paid tiers gate cloud inference credits and tokens rather than core local usage.
| Plan Name | Price (Monthly / Yearly) | Main Features | Best For |
|---|---|---|---|
| Free | $0 | Local inference on device-supported models (up to ~13B comfortably), unlimited offline chat, basic model library, standard speed | Casual users, privacy-focused individuals, low-end phone owners, light daily chatting |
| Starter / Basic | ~$4.99–$9.99 / month (annual ~$50–$90) | Cloud access to larger models (up to 70B), higher daily token limits, faster cloud inference, no throttling on small models | Regular mobile users, roleplayers, writers needing bigger models without a PC |
| Pro / Premium | ~$14.99–$29.99 / month (annual ~$150–$290) | Very high token limits, access to 405B-class models, priority cloud queue, longer context windows, advanced uncensored variants, API credits | Power users, developers prototyping on mobile, creators needing consistent high-quality output |
| Enterprise / Custom | Custom (contact team) | Dedicated cloud capacity, private model hosting, team accounts, SLA support, custom integrations | Businesses, agencies, or teams using mobile AI inference at scale |
Best Alternatives to Weights AI App
Several mobile and cross-platform tools offer open-source LLM access, local inference, or uncensored chat, each with different hardware requirements, model selections, and pricing.
| Alternative Tool Name | Free or Paid | Key Feature | How it compares to Weights AI App |
|---|---|---|---|
| MLC Chat / MLC LLM | Free (open-source) | Fully local inference on Android/iOS with optimized models | Completely offline & privacy-first; requires more technical setup & model conversion vs Weights’ polished app experience |
| Perplexity Mobile | Freemium + Paid Pro | AI search + chat with web access & citations | Excellent real-time knowledge; more search/research-oriented vs Weights’ pure uncensored LLM focus |
| Grok (xAI) Mobile | Free + Premium | Uncensored Grok models with humor & real-time info | Strong personality & knowledge; tied to X ecosystem vs Weights’ broad open-source model choice |
| Faraday.dev | Free (desktop/mobile) | Local uncensored chat with many models | Good offline privacy; more desktop-focused & less mobile-optimized than Weights |
| Ollama Mobile wrappers (e.g., Enclave, LLM Farm) | Free (open-source) | Run Ollama-compatible models on phone | Maximum model flexibility & privacy; technical setup & slower on mid-range phones vs Weights’ ready-to-use interface |
| Venice.ai Mobile | Free + Paid | Uncensored web chat optimized for mobile | Very similar free uncensored experience; cloud-based vs Weights’ strong local inference capability |
Pros and Cons of Weights AI App
Pros:
- Runs powerful open-source LLMs directly on your phone (local inference) for privacy and offline use
- Completely free core experience—most models and chatting require no payment
- Huge selection of uncensored, high-quality models (Llama, Mistral, Qwen, Gemma, etc.)
- Clean, mobile-optimized chat interface that feels fast and modern
- Affordable paid cloud boost for larger models or faster speeds when phone hardware is limited
- Supports creative, roleplay, coding, writing, and uncensored use cases without heavy filters
Cons:
- Local inference performance heavily depends on phone hardware—mid-range devices struggle with >13B models
- Free tier has daily token/message limits on cloud inference (resets every 24h)
- Largest models (70B–405B) usually require paid cloud tier for usable speed on mobile
- No built-in long-term memory across sessions in basic mode
- Occasional model loading delays or memory pressure on lower-end phones
- Primarily English-focused—multilingual performance varies significantly by model
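To see why on-device inference tops out around the 13B mark mentioned above, it helps to estimate the RAM a quantized model needs. The sketch below is back-of-the-envelope arithmetic, not the app's published specs: the 4-bit quantization level and ~20% runtime overhead are assumptions typical of mobile LLM runtimes.

```python
def model_memory_gb(params_billion, bits_per_weight, overhead=1.2):
    """Rough RAM footprint for loading quantized LLM weights.

    overhead is an assumed ~20% allowance for the KV cache and
    runtime buffers on top of the raw weight storage.
    """
    bytes_per_weight = bits_per_weight / 8
    return params_billion * 1e9 * bytes_per_weight * overhead / 1e9

# A 13B model at 4-bit quantization needs roughly:
print(round(model_memory_gb(13, 4), 1))  # -> 7.8 (GB)

# A 70B model at the same quantization:
print(round(model_memory_gb(70, 4), 1))  # -> 42.0 (GB)
```

Under these assumptions, a 13B model at 4-bit lands near 8 GB, within reach of flagship phones with 12–16 GB of RAM, while a 70B model needs over 40 GB, which is why the largest models are only practical through the paid cloud tiers.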