Why is the quantum chip Willow causing a sensation in the global technology industry?

Source: Zeping Macro

On December 10th, Google announced its latest-generation quantum chip, Willow, causing a sensation in the global technology community and even drawing a ‘Wow’ from Elon Musk.

How powerful is the Willow chip? How far away is it from mass production?

1. Google’s latest-generation quantum chip Willow explodes onto the scene; its biggest breakthroughs are raw computing power and error correction

For a benchmark task called ‘random circuit sampling’, the fastest supercomputer today would need 10^25 years, far longer than the age of the universe (about 13.8 billion years), while Willow completes the task in under five minutes.

Quantum computing has the potential to vastly outpace classical computers on specific tasks, a phenomenon known as ‘quantum supremacy’. As early as 2019, Google verified this and published the result in Nature: its 54-qubit quantum computer, Sycamore, completed a task beyond traditional architectures, finishing in 3 minutes and 20 seconds a calculation that would have taken the world’s fastest supercomputer 10,000 years. Google CEO Sundar Pichai described it as the long-awaited ‘Hello World’ moment for researchers, the most meaningful milestone in practical quantum computing at the time.

The release of Willow this time is undoubtedly another landmark event in the field of quantum computing.

However, ‘fast’ is not the most remarkable breakthrough of Willow.

The biggest highlight of Willow is its strong error correction ability.

In the past, the fragility of quantum states made quantum chips prone to decoherence during data processing, corrupting qubit states. So despite possessing ‘quantum supremacy’, quantum computers are easily disturbed by their environment and error-prone; typically, the more qubits there are, the more errors occur.

‘Quantum error correction’ has therefore become a key technology and a central challenge of the field, one that long restricted the practical application and development of quantum computing.

The Willow chip has successfully solved the quantum error correction problem that has plagued researchers for nearly 30 years and achieved an exponential reduction in error rates. Google’s research shows that the more qubits used in Willow, the lower the system’s error rate.

In Google’s experiments, each time the encoded qubit array grows, from 3×3 to 5×5 to 7×7, the logical error rate falls by a factor of 2.14. The suppression is exponential: errors drop faster and faster as the array scales.
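As a back-of-envelope illustration of that scaling, the sketch below applies the suppression factor of 2.14 per step in code distance that Google reported; the starting error rate is a made-up placeholder, not a measured figure.

```python
# Toy model of exponential error suppression: each step up in code distance
# (3x3 -> 5x5 -> 7x7 qubit array) divides the logical error rate by the
# factor Lambda = 2.14 reported for Willow. The base rate is illustrative.
LAMBDA = 2.14
base_rate = 1e-2  # hypothetical logical error rate for the 3x3 array

rates = {}
rate = base_rate
for distance in (3, 5, 7):
    rates[distance] = rate
    rate /= LAMBDA

for d in sorted(rates):
    print(f"{d}x{d} array: logical error rate ~ {rates[d]:.2e}")
```

Dividing by a constant factor at every step is exactly what makes the decrease exponential rather than linear in the array size.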

2. What is quantum computing? Why is it so powerful?

In 1935, the Austrian physicist Erwin Schrödinger proposed a thought experiment: put a cat in a box with radioactive material; there is a 50% chance the material decays and releases poison gas, killing the cat, and a 50% chance it does not and the cat survives. Before the box is opened, no one knows whether the cat is alive or dead; it can only be described as a superposition of alive and dead.

The quantum world, like Schrödinger’s cat, sits in superposition. The corresponding computing theory is “quantum computing,” realised in hardware as quantum chips and quantum computers.

Quantum computing exhibits two advantages:

**First, powerful data storage capacity.** Classical computing uses the bit as its basic unit, while quantum computing uses the quantum bit (qubit).

In classical computing, a bit’s state is deterministic, either 0 or 1, whereas a qubit exists in a superposition of 0 and 1; in other words, it can store 0 and 1 simultaneously.

A traditional chip with n bits can hold only one of its 2^n possible values at a time, while a chip with n qubits can hold all 2^n values simultaneously in superposition.
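The exponential growth in state space is easy to see in a toy state-vector simulation. The NumPy sketch below is illustrative only (the function name and sizes are ours, not from the article): an n-qubit register is described by 2^n complex amplitudes.

```python
import numpy as np

def state_vector(n_qubits):
    """Amplitude vector for n qubits, initialised to the basis state |0...0>.

    A register of n qubits needs 2**n complex amplitudes to describe, which
    is why quantum storage capacity grows exponentially with qubit count.
    """
    psi = np.zeros(2 ** n_qubits, dtype=complex)
    psi[0] = 1.0  # all probability amplitude on |0...0>
    return psi

for n in (1, 3, 10):
    print(f"{n} qubit(s) -> {state_vector(n).size} amplitudes")
```

The doubling per qubit is also why classical simulation of quantum chips hits a wall: simulating just 50 qubits already requires about 2^50 amplitudes.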

**Second, strong parallel computing capability for specific problems.**

Conventional computers compute serially: each operation turns one value into another, so values must be processed one after the next. A quantum computer, by contrast, can transform 2^n amplitudes into 2^n new amplitudes in a single operation.
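A minimal sketch of this parallelism, using a toy NumPy state-vector model: applying a Hadamard gate to every qubit of |0...0> yields an equal superposition of all 2^n basis states in one matrix application. Note that the classical simulation below still does O(2^n) work; the parallelism belongs to the quantum hardware, not to this model.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # single-qubit Hadamard gate

def uniform_superposition(n):
    """One 'operation' acting on all 2**n amplitudes at once:
    H applied to every qubit maps |0...0> to an equal superposition."""
    U = np.array([[1.0]])
    for _ in range(n):
        U = np.kron(U, H)  # tensor up H across all n qubits
    psi = np.zeros(2 ** n)
    psi[0] = 1.0           # start in |0...0>
    return U @ psi         # single matrix application updates every amplitude

print(uniform_superposition(3))  # eight equal amplitudes of 1/sqrt(8)
```

This is the textbook first step of many quantum algorithms: prepare all inputs in superposition, then let one circuit act on all of them at once.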

3. Can future quantum chips replace GPUs and drive the development of AI?

Artificial intelligence technology and various applications have developed rapidly in recent years, and the demand for computing power has also grown exponentially.

In theory, the parallel processing capability of quantum computing gives it a natural advantage in handling complex artificial intelligence algorithms, which can greatly improve the training speed and accuracy of models. The emergence of the Willow chip may provide powerful computational power for the further development of artificial intelligence.

In fact, GPUs, now ubiquitous in AI, were originally designed to accelerate graphics: 3D scene rendering in games, modeling and effects in animation, and visual effects in film and television. Because of their powerful parallel computing capabilities, GPUs were later adopted widely in scientific computing and AI, especially for neural-network training and inference in deep learning, where they excel at large datasets and highly parallel workloads.

From this perspective, quantum chips may likewise break through their current limits and, in the future, accelerate the training of AI and machine-learning algorithms. For now, quantum chips are aimed mainly at domains of extreme computational complexity: cryptanalysis (for example, potential threats to encryption based on the RSA algorithm), quantum-system simulation (modeling the physical and chemical properties of molecules and materials at the quantum level), and complex optimization (combinatorial problems such as logistics planning and resource allocation). In these areas quantum computing’s advantages can be fully exploited, potentially solving tasks that traditional computers cannot finish in an acceptable time frame.

The computing power of quantum chips grows mainly with the number and quality of qubits. Each additional qubit doubles the number of possible state combinations: 2 qubits give 4 combinations, 3 qubits give 8, and so on, so computing power grows exponentially with qubit count. Qubit quality (coherence time, fidelity, and so on) also matters greatly: high-quality qubits hold their quantum states longer, enabling more accurate and complex computation.

In the short term, however, quantum chips are unlikely to shake the GPU’s position. Quantum chips have, in theory, the raw computing power to replace GPUs, but computing power is only one aspect of the GPU moat. More important are its programmable architecture and developer ecosystem, manufacturing processes, and industrial maturity.

**The programmable architecture and developer ecosystem of GPUs are the core barriers.** Nvidia spent more than a decade paving the way for the GPU-driven “AI computing revolution.”

CUDA (Compute Unified Device Architecture) is the GPU programming platform NVIDIA launched in 2006. Its value lies in the GPU developer ecosystem it built: algorithm engineers can tap GPU capabilities for their own needs, which expanded GPUs from graphics rendering into general-purpose computing.

Software built on new hardware such as quantum chips would have to be developed from scratch, while today’s major AI software mostly depends on the CUDA platform, so breaking away from CUDA carries a high cost. Add to this the moat effect of the developer community: many high-performance-computing developers have accumulated their experience in the CUDA ecosystem, and CUDA is downloaded as many as five million times a year. Moving that community to another programming model would be a decade-long project.

**The GPU manufacturing process and industry chain are mature, with a broad consumer market and a positive industrial cycle.**

GPUs have a 25-year history, and downstream applications (personal PCs, custom development, AI data centers) have built up commercial scenarios over 10 to 30 years. Today it takes about a year from chip project kickoff to wafer tape-out and another year from tape-out to mass production, with GPU development driving a linked cycle of lithography-equipment development and foundry process iteration. Such a solid industry chain, reinforced by more than a decade of positive feedback, is hard to break.

Quantum-chip manufacturing, by contrast, overlaps little with the GPU industry chain. Designing and manufacturing quantum chips is extremely complex, requiring highly purified experimental environments, precise quantum-control technology, and stable qubits. The field has long been a handful of top technology companies working alone, and no mature supply chain has yet formed, so mass production and commercial application of quantum chips remain a major challenge in the short term.

4. The biggest areas affected by quantum chips: cryptocurrency and “HPC+AI”

4.1 Quantum chips may be the “nemesis” of cryptocurrencies

Taking Bitcoin as an example, its security rests on two key mechanisms. The first is the “mining” mechanism: Bitcoin production relies on Proof of Work (PoW) based on a hash function, and the higher the hash rate, the greater the likelihood of successful mining. The second is transaction signing, based on the elliptic-curve digital signature algorithm (ECDSA), which serves as the user’s “identity wallet.” These two mechanisms make Bitcoin virtually unbreakable by traditional computing, while quantum chips pose a direct threat to both.

First, quantum computing can brute-force the “mining” mechanism. Quantum algorithms can accelerate hash-function evaluation, i.e., speed up mining, by margins beyond any traditional device. Mining success rates would rise, cryptocurrency supply would surge, and market prices would swing sharply. On December 10th, Bitcoin dropped from $100,000 to $94,000; according to Coinglass data, 237,000 traders were liquidated between December 10th and 12th.
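To make the mining mechanism concrete, here is a toy proof-of-work loop in Python (the block data and difficulty are made up for illustration). Classically, finding a valid nonce takes about 2^k hash evaluations on average for k difficulty bits; Grover’s algorithm would need only about 2^(k/2).

```python
import hashlib

def mine(block_data: str, difficulty_bits: int = 16):
    """Toy proof-of-work: find a nonce whose SHA-256 hash falls below a
    target, i.e. has `difficulty_bits` leading zero bits. Real Bitcoin
    mining is this loop at vastly higher difficulty."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()
        nonce += 1

nonce, block_hash = mine("example block header")
print(nonce, block_hash[:12])
```

A quantum speedup on this search loop is equivalent to a sudden jump in hash rate, which is why it would distort mining success rates and coin supply.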

Second, quantum computing directly threatens transaction signatures. Cryptocurrency transactions involve two credentials, the “public key” and the “private key”: the former is like a bank-card number, the latter like the wallet password. Normally, exposing a public-key address does not endanger a user’s funds, but quantum computing could derive private keys from public keys and forge transactions. Shor’s algorithm, in particular, is designed to crack large-integer factorization and discrete-logarithm problems, posing a serious threat to transaction signatures.
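What Shor’s algorithm attacks can be shown in miniature with a toy RSA-style modulus (61 × 53, a textbook example, not a real key). The classical trial division below runs in time exponential in the bit-length of n; Shor’s algorithm would find the factors in polynomial time on a sufficiently large quantum computer.

```python
def trial_factor(n: int):
    """Classical factoring by trial division: exponential in the bit-length
    of n, which is what keeps RSA-style keys safe today. Shor's algorithm
    solves the same problem in polynomial time on a quantum computer."""
    i = 2
    while i * i <= n:
        if n % i == 0:
            return i, n // i
        i += 1
    return None  # n is prime

print(trial_factor(3233))  # toy modulus 61 * 53 -> (53, 61)
```

ECDSA rests on the discrete-logarithm problem rather than factoring, but Shor’s algorithm breaks both, which is why it threatens transaction signatures as well.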

Although Willow itself poses little threat to Bitcoin today, cryptocurrencies are likely to be breached by quantum computing eventually. In theory, attacking Bitcoin’s signature and mining mechanisms would require on the order of millions of physical qubits, a huge gap from Willow’s current 105. But if Willow iterates the way general-purpose GPUs did, reaching mass production and a leap in computing power, Bitcoin being “conquered” within the next decade is not impossible.

4.2 Quantum chips will advance “HPC+AI” and drive the development of higher-order artificial intelligence

According to OpenAI’s classification of AI from L1 (Chatbot) to L5 (AGI), the current development of AI large models is only in the transitional stage from L1 to L2. L5-level AGI is defined as having “organizational-level capabilities” to judge, reason, predict, and plan actions in dynamic and complex real environments. The industry believes that “HPC+AI” will be a key step in achieving AGI.

High-performance computing (HPC) refers to the use of powerful computer capabilities to solve scientific, engineering, and technical implementation problems, which is somewhat similar to today’s AI large models, but with different directions and emphases.

HPC focuses on “solving complex problems”, bringing significant scientific breakthroughs in applications such as meteorology, physics, and astronomy with supercomputers.

AI models focus on “reasoning and generation”; they are weak at exact complex modeling but highly versatile.

The arrival of quantum chips is a revolutionary breakthrough for HPC: solving complex problems no longer requires the long “brute-force” runs of traditional HPC and can instead develop in a new direction, combining with AI for more complex general training.

First, traditional AI training cannot process qubit-level data. Quantum computing can optimize learning models that classical computing cannot handle and build quantum-aware system models. Future AI models could gain the ability to reason about and predict complex environments, reducing or even eliminating the “AI hallucination” seen in today’s large models.

The second advantage is quantum error correction. Willow overcomes key challenges in quantum error correction and achieves a significant reduction in error rates. Applied to high-order AI training, error correction can keep models accurate and reliable while training on large volumes of complex data, reducing the computational errors caused by fragile qubits and thereby improving the effectiveness and credibility of AI training.

Although current AI training does not yet have the conditions to apply quantum chips, it is highly likely that quantum chips will be needed as the core support for computing power in the future. Due to the extreme sensitivity of quantum bits, they are easily affected by external environmental factors, including temperature and electromagnetic fields, which may cause quantum states to decohere, thereby affecting the accuracy of computation results. Despite Willow’s progress in quantum error correction technology, in practical applications of artificial intelligence training, further improvements are needed in the stability and anti-interference performance of quantum systems to achieve long-term stable operation.

Google’s release of the new-generation quantum chip Willow has caused a huge sensation in the global technology community. It is not only a major breakthrough in quantum computing but also a signpost to the next global technology frontier.

The road to the future development of quantum computing technology still has many obstacles, and there are many unresolved challenges before it can be widely used for AI training.

Technological progress has never been smooth sailing, just as GPUs went from obscurity to great success.
