Will Knight, wired.com
Samsung is using artificial intelligence to automate the insanely complex and subtle process of designing cutting-edge computer chips.
The South Korean giant is one of the first chipmakers to use AI to create its chips. Samsung is using AI features in new software from Synopsys, a leading maker of chip-design software whose tools are used by many companies. “What you’re seeing here is the first of a real commercial processor design with AI,” says Aart de Geus, the chairman and co-CEO of Synopsys.
Others, including Google and Nvidia, have talked about designing chips with AI. But Synopsys’ tool, called DSO.ai, may prove the most far-reaching because Synopsys works with dozens of companies. The tool has the potential to accelerate semiconductor development and unlock novel chip designs, according to industry watchers.
Synopsys has another valuable asset for crafting AI-designed chips: years of cutting-edge semiconductor designs that can be used to train an AI algorithm.
A spokesperson for Samsung confirms that the company is using Synopsys AI software to design its Exynos chips, which are used in smartphones, including its own branded handsets, as well as other gadgets. Samsung unveiled its newest smartphone, a foldable device called the Galaxy Z Fold3, earlier this week. The company did not confirm whether the AI-designed chips have gone into production yet, or what products they may appear in.
Across the industry, AI appears to be changing the way chips are made.
A Google research paper published in June described using AI to arrange the components on the Tensor chips that it uses to train and run AI programs in its data centers. Google’s next smartphone, the Pixel 6, will feature a custom chip manufactured by Samsung. A Google spokesperson declined to say whether AI helped design the smartphone chip.
Chipmakers including Nvidia and IBM are also dabbling in AI-driven chip design. Other makers of chip-design software, including Cadence, a competitor to Synopsys, are also developing AI tools to aid with mapping out the blueprints for a new chip.
Mike Demler, a senior analyst at the Linley Group who tracks chip-design software, says artificial intelligence is well suited to arranging billions of transistors across a chip. “It lends itself to these problems that have gotten massively complex,” he says. “It will just become a standard part of the computational tool kit.”
Using AI tends to be expensive, Demler says, because it requires a lot of cloud computing power to train a powerful algorithm. But he expects it to become more accessible as the cost of computing drops and models become more efficient. He adds that many tasks involved in chip design cannot be automated, so expert designers are still needed.
Modern microprocessors are incredibly complex, featuring multiple components that need to be combined effectively. Sketching out a new chip design normally requires weeks of painstaking effort as well as decades of experience. The best chip designers employ an instinctive understanding of how different decisions will affect each step of the design process. That understanding cannot easily be written into computer code, but some of the same skill can be captured using machine learning.
The AI approach used by Synopsys, as well as by Google, Nvidia, and IBM, uses a machine-learning technique called reinforcement learning to work out the design of a chip. Reinforcement learning involves training an algorithm to perform a task through reward or punishment, and it has proven an effective way of capturing subtle and hard-to-codify human judgment.
The method can automatically draw up the basics of a design, including the placement of components and how to wire them together, by trying different designs in simulation and learning which ones produce the best results. This can speed the process of designing a chip and allow an engineer to experiment with novel designs more efficiently. In a June blog post, Synopsys said one North American manufacturer of integrated circuits had improved the performance of a chip by 15 percent using the software.
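The loop described above — propose a layout, score it in simulation, and steer future proposals toward higher-scoring ones — can be illustrated with a deliberately tiny sketch. Everything below is invented for illustration (the three-block netlist, the Manhattan-wirelength cost, the epsilon-greedy learner); production tools like DSO.ai handle billions of transistors and far richer reward signals.

```python
# Toy reinforcement-learning-style placement search (illustrative only).
import itertools
import random

# Three hypothetical blocks to place on a 2x2 grid of cells.
CELLS = [(0, 0), (0, 1), (1, 0), (1, 1)]
NETS = [(0, 1), (1, 2)]  # block 0 is wired to block 1, block 1 to block 2

def wirelength(placement):
    """Stand-in 'simulator': total Manhattan wire length of a placement."""
    total = 0
    for a, b in NETS:
        (xa, ya), (xb, yb) = placement[a], placement[b]
        total += abs(xa - xb) + abs(ya - yb)
    return total

# Action space: every assignment of the 3 blocks to distinct cells.
ACTIONS = list(itertools.permutations(CELLS, 3))

def train(episodes=2000, epsilon=0.2, alpha=0.5, seed=0):
    rng = random.Random(seed)
    value = {a: 0.0 for a in ACTIONS}  # learned value estimate per layout
    for _ in range(episodes):
        if rng.random() < epsilon:       # explore: try a random layout
            action = rng.choice(ACTIONS)
        else:                            # exploit: reuse the best known layout
            action = max(value, key=value.get)
        reward = -wirelength(action)     # shorter wires earn higher reward
        value[action] += alpha * (reward - value[action])
    return max(value, key=value.get)

best = train()
print(wirelength(best))  # total wire length of the best layout found
```

After enough simulated trials, the learner settles on a layout that puts block 1 adjacent to both of its neighbors, the shortest possible wiring for this toy netlist. The real systems differ mainly in scale: the reward blends timing, power, and area, and a neural network generalizes across placements rather than tabulating each one.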
Most famously, reinforcement learning was used by DeepMind, a Google subsidiary, in 2016 to develop AlphaGo, a program capable of mastering the board game Go well enough to defeat a world-class Go player.
De Geus says his company realized that reinforcement learning could also be useful for chip design. “A bit over a year and a half ago, for the first time, we were able to get the same results as a team of experts would get in multiple months in just a few weeks,” de Geus says. He will present details of the technology and its development at HotChips, a semiconductor technology conference, on August 23.
Stelios Diamantidis, senior director of artificial intelligence solutions at Synopsys, says the DSO.ai software can be configured to prioritize different goals, such as performance or energy efficiency.
Semiconductors, as well as the tools used to make them, have become increasingly prized assets. The US government has sought to restrict the supply of chipmaking technology to China, a key rival, and some politicians have called for software to be added to the export controls list.
The emerging era of AI-designed chips also raises the prospect of simultaneously using AI to customize software to run more efficiently on a chip. This might include the neural network algorithms that run on specialized AI chips and are commonly used in modern AI.
“AI-powered codesign of software and hardware is a rapidly growing direction,” says Song Han, a professor at MIT who specializes in AI chip design. “We have seen promising results.”
This story originally appeared on wired.com.