Stablecoin issuer Tether announced today (17th) a major technological breakthrough for its AI infrastructure QVAC Fabric: the world’s first cross-platform BitNet LoRA fine-tuning framework. Large language models (LLMs), which previously required enterprise-grade GPUs and cloud computing power, can now be fine-tuned and run for inference on consumer-grade hardware, including smartphones.
Smartphones can now train LLMs: 1B-parameter models fine-tuned in under an hour
According to data released by Tether, the framework has successfully fine-tuned BitNet models on a range of devices, including mainstream consumer phones such as the Samsung S25 (Adreno GPU) and the iPhone 16 (Apple GPU). In stress tests, it handled fine-tuning of models with up to 13 billion parameters.
AI training workloads previously demanded high-end NVIDIA GPUs; they have now been compressed onto edge devices such as smartphones.
Key Technologies: BitNet + LoRA — Cutting AI costs drastically
The core of this breakthrough lies in the combination of two technologies:
BitNet (1-bit LLM): quantizes model weights to extremely low precision (ternary values, roughly 1.58 bits per weight), drastically cutting memory and compute requirements.
LoRA (Low-Rank Adaptation): freezes the base model and trains only small low-rank adapter matrices, shrinking the number of trainable parameters by orders of magnitude.
Together, these enable models to operate in extremely low-resource environments.
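As an illustration only (this is a hypothetical sketch, not Tether’s implementation), the combination can be captured in a few lines: the weight matrix is frozen in BitNet-style ternary form, while a small full-precision LoRA update remains trainable.

```python
# Hypothetical minimal sketch (illustrative only, not Tether's code):
# a frozen BitNet-style ternary weight matrix plus a trainable full-precision
# LoRA update, i.e. y = x @ (s * Q) + (alpha / r) * (x @ A) @ B.

def ternarize(W):
    """Absmean quantization: scale by the mean |w|, then round each weight
    to the ternary set {-1, 0, +1} (the "1.58-bit" code of BitNet b1.58)."""
    flat = [abs(x) for row in W for x in row]
    s = sum(flat) / len(flat) or 1.0          # per-matrix scale factor
    Q = [[max(-1, min(1, round(x / s))) for x in row] for row in W]
    return Q, s

def lora_forward(x, Q, s, A, B, alpha=1.0):
    """Forward pass: frozen quantized base (s * Q) plus the low-rank update
    (alpha / r) * (x @ A) @ B; only A and B would receive gradients."""
    r, d_out = len(A[0]), len(Q[0])
    base = [sum(x[i] * s * Q[i][j] for i in range(len(x))) for j in range(d_out)]
    xa = [sum(x[i] * A[i][k] for i in range(len(x))) for k in range(r)]
    up = [(alpha / r) * sum(xa[k] * B[k][j] for k in range(r)) for j in range(d_out)]
    return [b + u for b, u in zip(base, up)]
```

The parameter savings are what make on-device training plausible: for a 4096×4096 layer with rank r = 8, the adapters hold 65,536 trainable values versus 16.7 million in the full matrix, under 0.5%.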
Real-world tests show that BitNet-1B uses 77.8% less VRAM than Gemma-3-1B and 65.6% less than Qwen3-0.6B. On the same hardware, this allows running models approximately twice as large.
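A back-of-envelope reading of those figures (my interpretation, not Tether’s): if weights take X less VRAM at the same parameter count, the same VRAM budget fits roughly 1/(1 - X) times as many weights.

```python
# Back-of-envelope check of the VRAM reductions reported in the article
# (the "headroom" framing here is an illustrative assumption, not Tether's).
def headroom(reduction):
    """If weights use `reduction` (e.g. 0.778) less VRAM at equal parameter
    count, the same budget fits ~1/(1 - reduction) times as many weights."""
    return 1.0 / (1.0 - reduction)

print(round(headroom(0.778), 1))  # vs Gemma-3-1B: ~4.5x weight-only headroom
print(round(headroom(0.656), 1))  # vs Qwen3-0.6B: ~2.9x weight-only headroom
```

Weight-only headroom overstates the end-to-end gain, since activations and the KV cache do not shrink as aggressively as the weights; that is consistent with the roughly twofold model-size increase the article reports.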
Unlocking mobile GPUs for AI: performance boosted up to 11 times
Another key QVAC breakthrough is enabling BitNet to run on non-NVIDIA ecosystems: it supports GPUs from AMD, Intel, and Apple Silicon, as well as mobile GPUs such as Adreno, Mali, and Apple’s Bionic chips.
Large language models are no longer the exclusive domain of tech giants; AI can now be decentralized
Tether CEO Paolo Ardoino stated: “Intelligence will be a key factor in future societal development. It has the potential to enhance social stability, serve as a bridge connecting society, or further empower a select few. The future of AI should be accessible, usable, and available to everyone, not monopolized by a handful of cloud service providers with enormous resources.”
Traditional AI development relies heavily on cloud services and large GPU clusters, which are costly and concentrated among a few tech giants. Tether’s QVAC platform supports meaningful large-scale model training on consumer hardware, including smartphones, demonstrating that advanced AI can be decentralized and broadly inclusive. In the coming months, Tether says it will continue investing significant resources and funding to ensure AI is accessible anytime, anywhere, on local devices.
This article, “AI is no longer the exclusive preserve of tech giants! Tether launches QVAC — the era of everyone having an LLM?”, was originally published on Chain News ABMedia.