CoinWorld News: Anthropic co-founder and former OpenAI Policy Director Jack Clark highlighted the importance of decentralized training in his weekly AI newsletter, Import AI. He stated that decentralized training can improve data privacy and system robustness by distributing learning across multiple nodes. He cited a research report from Epoch AI, which analyzed more than 100 related papers and found that the compute scale of decentralized training is growing at roughly 20x per year, far faster than the roughly 5x annual growth of frontier centralized training. Decentralized training runs are still about 1,000 times smaller than frontier centralized runs, but the report concludes that the approach is technically feasible and could support broader, collective development of more powerful models.