Biojie News reports that Anthropic co-founder and former OpenAI policy director Jack Clark discussed the importance of decentralized training in his weekly newsletter, Import AI. He stated that decentralized training can improve data privacy and system robustness through distributed learning across multiple nodes. He cited a research report from Epoch AI, based on an analysis of over 100 related papers, which found that the compute used in decentralized training is growing at roughly 20x per year, significantly faster than the 5x annual growth rate of frontier centralized training. Decentralized training currently remains roughly 1000x smaller in scale than frontier centralized training, but it is technically feasible and could support broader, collective development of more powerful models.
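As a back-of-envelope check on those figures (an illustrative extrapolation, not a calculation made in the newsletter or the Epoch AI report): if both growth rates held constant, the 1000x gap would shrink by 20/5 = 4x per year. The sketch below computes the implied catch-up horizon; all three input numbers come from the article, while the assumption that the trends continue is purely hypothetical.

```python
import math

# Figures reported in the article; the extrapolation below is an
# illustrative assumption, not a claim from the source.
decentralized_growth = 20.0  # compute growth per year, decentralized training
centralized_growth = 5.0     # compute growth per year, frontier centralized training
current_gap = 1000.0         # decentralized is ~1000x smaller today

# If both trends continued, the gap shrinks by 20/5 = 4x per year,
# so it closes after log(1000) / log(4) years.
years_to_parity = math.log(current_gap) / math.log(
    decentralized_growth / centralized_growth
)
print(f"Implied years to parity: {years_to_parity:.1f}")  # ~5.0
```

Under these (strong) assumptions, decentralized training would reach the scale of frontier centralized runs in about five years.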