How AI Chips are Transforming Tech

According to Nikkei Asia, Arm Holdings, part of the SoftBank Group, intends to compete with Nvidia and Apple by releasing AI chips in 2025.


According to the story, UK-based Arm plans to launch an AI chip company and produce a working prototype by the spring of 2025, with contract manufacturers expected to begin mass production in October 2025.



According to the story, Arm and SoftBank will cover the initial development costs, which could run into the hundreds of billions of yen.

To put a mass-production framework in place, SoftBank is reportedly in talks with Taiwan Semiconductor Manufacturing Co (TSMC) and other companies to secure production capacity for the AI chips.

SoftBank and Arm declined to comment, and TSMC did not immediately respond to a request for comment.

Artificial Intelligence Chips

AI will shape both domestic and international security in the years ahead, and the US government is weighing measures to restrict the spread of AI knowledge and technology. Computer hardware is the natural point of control for modern AI systems, because general AI software, datasets, and algorithms are not effective targets for restrictions. The key to modern AI is computation at a scale that was unthinkable only a few years ago.

Training a state-of-the-art AI programme can take a month and cost on the order of $100 million. Such systems need high-capacity computer chips: those packing the most transistors and tailored to AI-specific tasks. Scaling AI economically requires state-of-the-art, specialised "AI chips"; relying on older or general-purpose chips can cost tens to thousands of times more. And because the sophisticated supply chains needed to produce cutting-edge AI chips are concentrated in the United States and a few allied countries, export controls are feasible.
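As a purely illustrative sanity check of that scale (the chip count, run length, and hourly rate below are assumptions for the sketch, not figures from the report), a back-of-the-envelope calculation looks like this:

```python
# Back-of-the-envelope estimate of a large AI training run.
# All numbers are illustrative assumptions, not reported figures.
num_chips = 10_000          # assumed number of AI accelerators running in parallel
hours = 30 * 24             # assumed duration: roughly one month of continuous training
cost_per_chip_hour = 12.0   # assumed all-in cost per chip-hour (hardware, power, facilities), USD

total_chip_hours = num_chips * hours
total_cost = total_chip_hours * cost_per_chip_hour

print(f"Chip-hours: {total_chip_hours:,}")        # 7,200,000 chip-hours
print(f"Estimated cost: ${total_cost:,.0f}")      # ~$86 million, on the order of $100 million
```

Under these assumed inputs, the total lands in the same ballpark as the $100 million figure cited above; the point is the order of magnitude, not the precision of any individual number.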


This report expands on the story above. It explains what AI chips are, why they are so widely used, and why they matter. It clarifies why cutting-edge, AI-specific chips are more cost-effective than older generations of chips, covers the trends in the semiconductor industry and in chip design that are shaping the evolution of AI chips, and summarises the technical and economic factors that determine the cost-effectiveness of AI applications.

In this study, AI refers to state-of-the-art, computationally expensive AI systems such as deep neural networks (DNNs). DNNs are behind recent AI triumphs such as DeepMind's AlphaGo, which beat the world Go champion. As noted above, "AI chips" are computer chips that run AI-specific computations quickly and efficiently but handle general-purpose calculations poorly.

This discussion focuses on AI chips and why they are necessary for developing and deploying AI at scale; it does not cover export-control targets or the AI chip supply chain in depth. Future CSET reports will examine the semiconductor supply chain, national competitiveness, the prospects for China's semiconductor industry to localise its supply chain, and the policies the US and its allies can pursue to maintain their advantages in AI chip production, along with recommendations on how such policies can promote the development and adoption of AI technology.

Industry Trends Favour AI Chips over General-Purpose Chips

From 1960 to 2010, shrinking transistors roughly doubled the number of transistors on a computer chip every two years, a trend known as Moore's Law. As a result, computer chips became millions of times faster and more efficient.
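The cumulative effect of that doubling cadence is easy to verify with a few lines of arithmetic; the sketch below simply assumes a clean two-year doubling period over the 1960–2010 span mentioned above:

```python
# Cumulative effect of Moore's Law: transistor counts doubling every two years.
years = 2010 - 1960            # the 50-year span cited above
doubling_period_years = 2      # assumed doubling period
doublings = years / doubling_period_years
improvement = 2 ** doublings

print(f"Doublings over {years} years: {doublings:.0f}")       # 25 doublings
print(f"Overall improvement factor: {improvement:,.0f}x")     # ~33.6 million-fold
```

Twenty-five doublings compound to a factor of roughly 33 million, which is where the "millions of times" figure comes from.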

Modern chips use transistors that are only a few atoms wide. Shrinking them further makes the engineering challenges ever harder, or outright impossible, which drives up the semiconductor industry's capital and labour costs. Moore's Law is therefore slowing: transistor density now takes longer than two years to double. Those rising costs are justified mainly because transistor shrinkage enables chip improvements such as greater transistor efficiency, higher transistor speed, and room for more specialised circuits.


Demand for specialised applications like artificial intelligence (AI), together with the slowdown of Moore's Law-driven CPU improvements, has undermined the economies of scale that long favoured general-purpose chips such as central processing units (CPUs). As a result, AI chips are gaining market share from CPUs.

Basics of AI Chips

  • AI chips include graphics processing units (GPUs), field-programmable gate arrays (FPGAs), and AI-specific application-specific integrated circuits (ASICs).
  • General-purpose chips like CPUs can handle basic AI tasks, but as AI advances, CPUs become less and less useful.
  • Like general-purpose CPUs, AI chips gain speed and efficiency by packing large numbers of small transistors, which switch faster and use less energy, allowing more computations per unit of energy.
  • Unlike CPUs, however, AI chips also have a range of AI-optimised design features.
  • These features dramatically speed up the identical, predictable, and independent computations that AI algorithms require.
  • They include performing many calculations in parallel rather than sequentially as CPUs do, implementing AI algorithms at lower numerical precision so that the same calculation needs fewer transistors (see the sketch after this list), and using programming languages built to translate AI code efficiently for execution on the chip.
  • For example, storing an entire AI algorithm on a single AI chip also speeds up memory access.
  • AI chips vary in what they are suited for. GPUs are most often used for "training", i.e., developing and refining AI algorithms.
  • FPGAs are typically used for "inference", applying trained AI algorithms to real-world data. ASICs can be designed for either training or inference.
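To make the low-precision point above concrete, here is a small NumPy sketch (a hypothetical illustration, not code from any chip vendor or from the report): the same matrix multiplication is run at 32-bit and 16-bit precision, and the half-precision version stores each value in half the memory, the kind of saving AI chips exploit in hardware, while the result stays close to the full-precision reference.

```python
import numpy as np

# Illustrative only: the same matrix multiply at full (float32) and half (float16) precision.
rng = np.random.default_rng(0)
a = rng.standard_normal((256, 256)).astype(np.float32)
b = rng.standard_normal((256, 256)).astype(np.float32)

full = a @ b                                                    # 32-bit reference result
half = (a.astype(np.float16) @ b.astype(np.float16)).astype(np.float32)

print(f"Bytes per value: float32={a.dtype.itemsize}, float16={np.float16().itemsize}")
print(f"Frobenius relative error of float16 result: "
      f"{np.linalg.norm(full - half) / np.linalg.norm(full):.2e}")
```

The relative error stays small compared with the magnitude of the results, which is why many neural-network workloads tolerate reduced precision; dedicated AI hardware pushes the same idea further with formats such as bfloat16 and 8-bit integers.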

Why AI Needs State-of-the-Art Chips

Thanks to their specialised design features, AI chips can train and run AI algorithms tens to thousands of times faster and more efficiently than CPUs. That efficiency makes state-of-the-art AI chips far more cost-effective than CPUs for modern AI algorithms: an AI chip a thousand times as efficient as a CPU delivers an improvement equivalent to 26 years of Moore's Law-driven CPU advances.
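One way to sanity-check that 26-year equivalence is to ask what CPU efficiency-doubling period it implies, under the simple assumption (mine, not the report's) that CPU efficiency compounds at a fixed doubling interval:

```python
import math

# Sanity check: what doubling period makes a 1,000x gap equal 26 years of CPU improvement?
efficiency_gap = 1000        # AI chip vs CPU efficiency factor cited above
years_claimed = 26           # years of Moore's Law-driven CPU improvement cited above

doublings_needed = math.log2(efficiency_gap)                   # ~10 doublings for 1,000x
implied_doubling_period = years_claimed / doublings_needed

print(f"Doublings needed for {efficiency_gap}x: {doublings_needed:.1f}")
print(f"Implied CPU doubling period: {implied_doubling_period:.1f} years")   # ~2.6 years
```

The implied cadence of roughly 2.6 years per doubling is somewhat slower than the classic two-year Moore's Law rhythm, consistent with the slowdown described in the previous section.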

Modern AI systems also require cutting-edge AI chips specifically. Older AI chips, built with bigger, slower, and more power-hungry transistors, quickly rack up energy costs that make them prohibitively expensive to run; they end up costing more and running slower than newer chips. Because of these cost and speed dynamics, developing and deploying state-of-the-art AI algorithms is practical only on state-of-the-art AI chips.


Even with state-of-the-art hardware, training an AI system can take weeks and cost tens of millions of dollars, and AI-related computing accounts for a large share of leading AI labs' spending. On general-purpose chips like CPUs or on older AI chips, the same training would take orders of magnitude more time and money, making research and deployment unfeasible. Likewise, running inference on less advanced or less specialised chips could mean far higher costs and much longer processing times.
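To see why "orders of magnitude more time and money" rules out general-purpose hardware, the toy comparison below applies a hypothetical thousand-fold efficiency gap to an assumed baseline training run (both the baseline figures and the gap are illustrative assumptions, not measurements):

```python
# Illustrative comparison of the same training run on AI chips vs CPUs.
# Baseline numbers and the efficiency gap are assumptions, not reported figures.
baseline_weeks = 2           # assumed training time on state-of-the-art AI chips
baseline_cost_usd = 20e6     # assumed training cost on state-of-the-art AI chips
efficiency_gap = 1000        # assumed AI-chip-vs-CPU efficiency factor

cpu_weeks = baseline_weeks * efficiency_gap
cpu_cost_usd = baseline_cost_usd * efficiency_gap

print(f"On AI chips: {baseline_weeks} weeks, ${baseline_cost_usd:,.0f}")
print(f"On CPUs:     ~{cpu_weeks / 52:.0f} years, ${cpu_cost_usd:,.0f}")   # decades, tens of billions
```

Even allowing for the crudeness of the model (in practice there are also limits on how many CPUs can usefully run in parallel), a run that finishes in weeks on AI chips stretches into decades and tens of billions of dollars on CPUs, which is what makes the approach unfeasible.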

Implications for National AI Competitiveness

State-of-the-art AI chips are necessary for the fast, cost-effective development and deployment of advanced, security-relevant AI systems. The United States and its allies hold an advantage in several of the semiconductor industries needed to produce these chips. US companies lead in AI chip design, including the electronic design automation (EDA) software used to design chips.

Chinese AI chip design firms lag behind and remain dependent on US EDA software. Most of the chip fabrication plants, or "fabs", capable of producing cutting-edge AI chips are controlled by American, Taiwanese, and South Korean firms, although one Chinese company has recently acquired some of this capacity.

Chinese AI chip designers therefore outsource manufacturing to non-Chinese fabs offering greater capacity and higher quality. American, Dutch, and Japanese firms dominate the market for the semiconductor manufacturing equipment (SME) that fabs depend on. These advantages could be eroded, however, if China succeeds in building an advanced chip industry of its own.


Because modern AI chips are essential to national security, the US and its allies must preserve their manufacturing edge. Future CSET papers will examine how the US and its allies can maintain that competitive edge and will explore points of control for ensuring that the development and application of AI technology contributes to global stability and broadly shared benefits.

News Source: AI chips
