Tether Launches World's First Billion-Parameter LoRA Training Framework for Mobile, Compatible with iPhone and Samsung

Tether’s Data and AI division QVAC announced a major technical breakthrough on March 17: the world’s first cross-platform LoRA fine-tuning framework supporting Microsoft’s BitNet (1-bit LLM) architecture. Integrated into QVAC Fabric, the technology sharply reduces memory and compute requirements, so billion-parameter models are no longer exclusive to enterprise-grade GPUs and can be trained “locally and fully privately” on ordinary smartphones and laptops.
(Background: Tether invests in Axiym to expand payment infrastructure: promoting USDT integration into global compliant payment networks)
(Additional context: Tether crosses into AI sleep technology! Leading a $50 million investment in Eight Sleep, valuation surges to $1.5 billion)

Table of Contents

  • The Magic of 1-bit Architecture: Making Mobile Performance “Small but Mighty”
  • Real-World Benchmarks: The Surprising Speed of the Samsung S25 and iPhone 16
  • Say Goodbye to API Keys, Building 100% Private Personal AI

In the field of artificial intelligence (AI), training powerful models has long been considered a money-burning exercise, heavily reliant on expensive NVIDIA systems or cloud computing. Stablecoin giant Tether is trying to rewrite that rule. Its technical arm, “Tether Data,” announced on March 17 the launch of the world’s first cross-platform BitNet LoRA fine-tuning framework for its QVAC (QuantumVerse Automatic Computer) platform.

The core value of this technology is that it lets AI models at the billion-parameter scale learn personalized behavior directly on the smartphone in a user’s pocket.

The Magic of 1-bit Architecture: Making Mobile Performance “Small but Mighty”

This breakthrough is based on Microsoft’s BitNet 1-bit LLM architecture. Through optimizations in QVAC Fabric, the memory footprint and computational load of BitNet models are reduced to extremely low levels. According to the announcement, the framework supports not only common NVIDIA GPUs but also achieves full compatibility with Intel, AMD, Apple M-series chips, and mobile GPUs such as Adreno (Android), Mali, and Apple Bionic.
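The memory savings come from BitNet constraining each weight to one of three values, {-1, 0, +1} (about 1.58 bits of information per weight). As a rough illustration, assuming the "absmean" ternary quantization described in Microsoft's BitNet b1.58 work (QVAC Fabric's actual kernels are not shown in the announcement):

```python
import numpy as np

# Conceptual sketch of "absmean" ternary quantization as described for
# BitNet b1.58: weights are snapped to {-1, 0, +1} with a single scale.
# This illustrates the published idea, not code from QVAC Fabric.

def absmean_quantize(W, eps=1e-8):
    scale = np.abs(W).mean() + eps              # per-tensor absmean scale
    Wq = np.clip(np.round(W / scale), -1, 1)    # snap to ternary values
    return Wq.astype(np.int8), scale            # ~1.58 bits of info per weight

rng = np.random.default_rng(42)
W = rng.standard_normal((4, 4))                 # toy weight matrix
Wq, scale = absmean_quantize(W)
print(Wq)
print(f"unique values: {sorted(np.unique(Wq).tolist())}")
```

Storing ternary weights instead of 16-bit floats is what collapses the memory footprint enough for phone-class hardware.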

This means AI that previously could only run in data centers can now be fine-tuned on a phone using Low-Rank Adaptation (LoRA). Tether states that the framework lets edge devices handle models twice as large as traditional Q4-quantized models, demonstrating extreme memory efficiency.
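LoRA keeps on-device training cheap by freezing the base weights and learning only two small low-rank factors. A minimal sketch of the idea (the matrix sizes and rank below are illustrative, not figures from Tether's announcement):

```python
import numpy as np

# Minimal sketch of Low-Rank Adaptation (LoRA): the base weight W stays
# frozen (in BitNet it would be ternary) and only two small factors A and B
# are trained. Sizes here are illustrative, not from the announcement.

d_in, d_out, r = 2048, 2048, 8                  # rank r << d_in, d_out

rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))          # frozen base weight
A = rng.standard_normal((r, d_in)) * 0.01       # trainable down-projection
B = np.zeros((d_out, r))                        # trainable up-projection, zero-init

def lora_forward(x, alpha=16.0):
    # Effective weight is W + (alpha / r) * B @ A, applied without ever
    # materializing the full-rank update matrix.
    return W @ x + (alpha / r) * (B @ (A @ x))

full_params = W.size
lora_params = A.size + B.size
print(f"trainable: {lora_params:,} vs full fine-tune: {full_params:,} "
      f"({full_params // lora_params}x fewer)")
```

Because B starts at zero, the adapted model is initially identical to the base model, and training touches only the two small factors rather than the full weight matrix.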

Real-World Benchmarks: The Surprising Speed of the Samsung S25 and iPhone 16

Tether’s engineering team shared benchmark figures showcasing the framework’s performance on modern smartphones:

  • 125 million (0.125B) parameters: Fine-tuning on a dataset of 300 biomedical documents takes about 10 minutes on a Samsung S25.
  • 1 billion (1B) parameters: The same fine-tuning task completes in 1 hour 18 minutes on a Samsung S25 and 1 hour 45 minutes on an iPhone 16.
  • Extreme challenge: The team also ran and fine-tuned a 13-billion-parameter (13B) model on an iPhone 16, pushing the physical limits of mobile hardware.
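A back-of-envelope estimate makes the "twice as large as Q4" claim plausible. The figures below ignore activations, KV cache, and quantization metadata, and are our own illustration rather than numbers from Tether's announcement:

```python
# Rough weight-memory estimate: why ~1.58-bit BitNet weights leave room
# for models roughly twice the size of 4-bit (Q4) quantized ones.
# Illustrative arithmetic only, not figures from Tether's announcement.

def weight_mem_gb(params: int, bits_per_weight: float) -> float:
    return params * bits_per_weight / 8 / 1e9   # bits -> bytes -> GB

params = 1_000_000_000                          # a 1B-parameter model
for name, bits in [("FP16", 16), ("Q4", 4), ("BitNet b1.58", 1.58)]:
    print(f"{name:>12}: {weight_mem_gb(params, bits):.2f} GB of weights")
```

At about 4 / 1.58 ≈ 2.5× fewer bits per weight than Q4, the arithmetic is consistent with the article's claim that edge devices can handle models roughly twice as large.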

Say Goodbye to API Keys, Building 100% Private Personal AI

Tether CEO Paolo Ardoino has emphasized: “If you need an API key to use AI, then it doesn’t truly belong to you.” The core philosophy of QVAC is “Local-first.”

With the BitNet LoRA framework, users can let AI learn directly from local emails, notes, and messages without uploading any data to cloud servers. This not only eases concerns about misuse of sensitive data but also loosens the grip a few giants hold on AI development. QVAC Fabric LLM is released as open-source software under the Apache 2.0 license, with pre-configured adapters available on Hugging Face, so developers worldwide can immediately start building for this edge-computing shift.
