Using AI to Build Better Processors: Google Was Just the Start, Says Synopsys – AnandTech
In an exclusive to AnandTech, we spoke with Synopsys CEO Aart de Geus ahead of a pair of keynote presentations at two upcoming technical semiconductor industry events this year. Synopsys reached out to give us an overview of the key topic of the day, and of the year: as part of these talks, Aart will discuss what was considered impossible only a few years ago, the path to a better, automated way of designing chips through the use of machine learning. Within the context of EDA tools, as Google has demonstrated recently, engineers can be assisted in building better processors using machine learning algorithms.
If you read mainstream columns about technology and growth today, there is a prominent focus on the concepts of big data, artificial intelligence, and the value of analyzing that data. With enough data analyzed effectively, companies have shown that they can be proactive with customers, predict their needs in advance, or identify trends and react before a human has even seen the data. The more data you have analyzed, the better your actions or reactions can be. This means that analyzing the data itself has intrinsic value, as does the speed at which it is processed. It has caused an explosion in demand for better analysis tools, but also an explosion in data creation itself. Many senior figures in technology and business see the intersection of machine learning and data analysis tools to churn through that data as the mark of the next generation of economics.
Graph showing manufacturing growth of key silicon product lines since 2016 at TSMC, the world's largest contract manufacturer
The desire to have the best solution is accelerating the development of better utilities, but at the same time, the need to deploy them at scale is creating immense demand for resources. All the while, a number of critics are forecasting that Moore's Law, a 1960s observation about the exponential development of complex computing that has held true for 50 years, is reaching its end. Others are busy helping it stay on track. As driving performance requires innovation on multiple levels, including hardware and software, the need to optimize every abstraction layer to continue that exponential growth has become more complex, more expensive, and requires a fundamental economic gain for those involved to continue investing.
One way of driving performance on the hardware side is designing processors to work faster and more efficiently. Two processors with the same fundamental building blocks can have those blocks placed in many different orientations, with some arrangements beneficial for power, others for performance, or perhaps for design area, while some configurations make no sense whatsoever. Finding the best combination in light of the economics of the time is often crucial to the competitiveness of the product and the buoyancy of the company that relies on the success of that product. The semiconductor industry is rare in that most chip design companies effectively bet the entire company on the success of the next generation, which makes every generation's design more important than the last.
In light of the rate of innovation, chip design teams have spent tens of thousands of hours honing their skills over decades. But we are at a stage where a modern complex processor has billions of transistors and millions of building blocks to put together into something the size of a toenail. These teams use their expertise, intuition, and nous to place these units in the best configuration, which then gets simulated over the course of 72 hours. The results are analyzed, the design goes back to be updated, and the process repeats. Getting the best human-designed processor in this fashion can take six months or more, because the number of possible arrangements is equivalent to the number of atoms in the known universe raised to the power of the number of atoms in the known universe. With numbers so large, using computers to brute-force the best configuration is impossible. At least, it was thought to be.
Work from Google was recently published in the scientific journal Nature about how the company is already using custom AI tools to develop better silicon, which in turn helps develop better custom AI tools. In the research paper, the company applied machine learning algorithms to find the best combination of power, performance, and die area for a number of test designs.
In order to reduce the complexity of the problem, Google limited its scope to certain layers within the design. Take, for example, an electrical circuit designed to add numbers together. In Google's work, rather than trying to find the best way to build a circuit like this every time, the team took a good adder design as a fundamental building block of the problem, mapped how it interacts with other fundamental blocks, and then let the AI software find the best way to arrange these fundamental blocks. This cuts down the number of configurations needed, but the problem is still a difficult one to crack, as these blocks interact with other blocks to varying degrees based on proximity, connections, and electrical/thermal interactions. The nature of the work always depends on what level of abstraction these building blocks take, and how complex or basic you make them.
Simple 8-stage example of how block placement and routing affect design choices
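To make the placement problem concrete, here is a minimal, hypothetical sketch, not Google's method: a handful of named blocks placed on a tiny grid, scored by total Manhattan wirelength as a crude stand-in for power/performance/area, with a naive random search in place of a learned policy. All block names, nets, and numbers are invented for illustration.

```python
import random

def wirelength(placement, nets):
    """Sum of Manhattan distances for every connected block pair."""
    total = 0
    for a, b in nets:
        (x1, y1), (x2, y2) = placement[a], placement[b]
        total += abs(x1 - x2) + abs(y1 - y2)
    return total

def random_search(blocks, nets, grid, iters=2000, seed=0):
    """Keep the best of many random placements: brute force in miniature."""
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(iters):
        cells = rng.sample(grid, len(blocks))   # distinct grid cells
        placement = dict(zip(blocks, cells))
        cost = wirelength(placement, nets)
        if cost < best_cost:
            best, best_cost = placement, cost
    return best, best_cost

# Hypothetical fundamental blocks and their connectivity
blocks = ["adder", "mul", "regfile", "cache"]
nets = [("adder", "regfile"), ("mul", "regfile"), ("regfile", "cache")]
grid = [(x, y) for x in range(4) for y in range(4)]

placement, cost = random_search(blocks, nets, grid)
print(placement, cost)
```

Even this toy version hints at the scaling problem: four blocks on a 4x4 grid already allow tens of thousands of placements, and real designs have millions of blocks, which is why learned search rather than enumeration is the interesting development.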
In Google's paper, the company states that its tools have already been put to use in helping design four parts of an upcoming Google TPU processor designed for machine learning acceleration. While the paper shows that AI tools weren't used across the whole processor, they take some of the work that used to be painstaking engineering labor and accelerate it through computation. The beauty of this application is that the way these building blocks can be put together scales, and companies like Google can use their datacenters to test thousands of configurations in a single day, rather than having a group of engineers provide a handful of options after several months.
Google's approach also details the effect of using optimized machine learning (algorithms that have learned how to be better by examining previous designs) against fresh machine learning (algorithms with only a basic understanding that learn from their own trial and error). Both areas are important, showcasing that in some circumstances the algorithms do not need to be pre-trained and can still deliver a better-than-human result. That result still requires additional validation for effectiveness, and the results are fed back to the software team to create better algorithms.
But this is just the tip of the iceberg, according to Synopsys CEO Aart de Geus, whose company's software helps develop more silicon processing intellectual property in the industry today than anyone else's. Synopsys has been involved in silicon design for over 35 years, with hundreds of customers, and its latest AI-accelerated product is already in use at a number of high-profile silicon design teams making processors today, helping accelerate time to market with better semiconductor placement than humans can achieve.
Synopsys is a company that makes EDA (Electronic Design Automation) tools, and every semiconductor company in the industry, both old and new, relies on some form of EDA to actually bring silicon to market. EDA tools allow semiconductor designers to effectively write code that describes what they are trying to make, which can then be simulated to sufficient accuracy to tell the designer whether it fits within strict parameters, meets the requirements for final manufacturing, has thermal problems, or whether signal integrity fails to meet the required specifications for a given standard.
EDA tools also rely on abstraction and decades of algorithm development, and as the industry moves to multi-chip designs and complex packaging technologies, the software teams behind these tools have to adapt quickly to an ever-changing landscape. Having relied on complex non-linear algorithms to assist designers to date, the computational requirements of EDA tools are substantial, and often not scalable. Ultimately, any significant improvement to EDA tool design is a welcome beacon in this market.
For context, the EDA tools market has two main competitors, with a combined market cap of $80B and a combined annual revenue of $6.5B. All the major foundries work with these two EDA vendors, and it is actively encouraged to stay within these toolchains, rather than to spin your own, to maintain compatibility.
Synopsys CEO Aart de Geus is set to give the keynote presentations at two upcoming technical semiconductor industry events this year: ISSCC and Hot Chips. As part of these talks, Aart will discuss what was considered impossible only a few years ago: the path to a better, automated way into chip design through the use of machine learning solutions. Within the context of EDA tools, as Google has demonstrated publicly, engineers can be assisted in building better processors, or alternatively, fewer engineers are needed to build a good processor. To this point, Aart's talk at Hot Chips will be titled:
Does Artificial Intelligence Require Artificial Architects?
I spent about an hour speaking with Aart on this topic and what it means for the wider industry. The discussion would have made a great interview, although unfortunately this was just an informal discussion! But in our conversation, aside from the simple fact that machine learning can help silicon design teams optimize more variations with better performance in a fraction of the time, Aart was clear that the fundamental drive and idea of Moore's Law, regardless of exactly how you want to interpret what Gordon Moore actually said, is still driving the industry forward in much the same way it has for the past 50 years. The difference now is that machine learning, as a cultural and industrial revolution, is enabling emergent compute architectures and designs, leading to a new wave of complexity, dubbed systemic complexity.
Aart also laid out for me how the semiconductor industry has evolved. At each stage of fundamental improvement, whether that's manufacturing improvement through process node lithography such as EUV, transistor architectures like FinFET or Gate-All-Around, or architectural innovation for different silicon structures such as high-performance compute or radio frequency, we have relied on architects and research to enable those step-function improvements. In a new era of machine-learning-assisted design, such as the tip of the iceberg presented by Google, new levels of innovation can emerge, albeit with a new level of complexity on top.
Aart described how every major leap, such as moving from 200mm to 300mm wafers, from planar to FinFET transistors, or from DUV to EUV, relies on economics: no one company can make the jump without the rest of the industry coming along and scaling costs. Aart sees the use of machine learning in chip design, at multiple abstraction layers, becoming a de-facto benefit that companies will adopt as a result of the current economic situation: the need to have the most optimized silicon layout for the use case required. Being able to produce 100 different configurations overnight, rather than one every few days, is expected to revolutionize how computer chips are made in this decade.
The era of AI accelerated chip design is going to be exciting. Hard work, but very exciting.
From Synopsys' point of view, the goal of introducing Aart to me, and giving me the chance to listen to his views and ask questions, was to offer a flavor of his Hot Chips talk in August. Synopsys has some very exciting graphs to show, one of which it has provided in advance below, on how its own DSO.ai software is tackling these emerging design complexities. The concepts apply to all areas of EDA tools, but this being a business, Synopsys clearly wants to show how much progress it has made in this area and what benefits it can bring to the wider industry.
In this graph, we are plotting power against wire delay. The best way to read this graph is to start at the labeled point at the top, marked "Start Point".
All of the small blue points indicate one full AI sweep of placing the blocks in the design. Over 24 hours, the resources in this test showcase over 100 different results, with the machine learning algorithm understanding what goes where with each iteration. The end result is something well beyond what the customer requires, giving them a better product.
There is a fifth point here that isn't labeled, and that is the purple dots that represent even better results. These come from the DSO algorithm running on a network pre-trained specifically for this purpose. The benefit here is that in the right circumstances an even better result can be achieved. But even then, an untrained network can get almost to that point as well, indicated by the best untrained DSO result.
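The sweep described above is, in effect, a search for the best trade-offs between power and wire delay, and the interesting results are the Pareto-optimal points: those not beaten on both axes by any other run. As a hedged illustration (the data and function names here are invented, not Synopsys'), a short sketch of filtering sweep results down to the non-dominated points:

```python
def pareto_front(points):
    """Return points not dominated by any other; lower is better on both axes."""
    front = []
    for p, d in points:
        dominated = any(p2 <= p and d2 <= d and (p2, d2) != (p, d)
                        for p2, d2 in points)
        if not dominated:
            front.append((p, d))
    return sorted(front)

# Illustrative sweep results as (normalized power, normalized wire delay),
# with (1.00, 1.00) standing in for the human-designed start point.
runs = [(1.00, 1.00),
        (0.95, 1.05), (0.90, 0.97), (0.85, 0.99),
        (0.88, 0.93), (0.92, 0.90), (0.97, 0.88)]

print(pareto_front(runs))
# The start point is dominated: several AI runs beat it on both axes.
```

Scaling this filtering to 100-plus overnight runs is trivial; the hard part, and the claimed value of the learned approach, is generating candidate placements that keep pushing that front down and to the left.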
Synopsys has already made some disclosures with customers, such as Samsung. Across four design projects, time to design optimization was reduced by 86%, from a month to days, using up to 80% fewer resources and often beating human-led design targets.
I did come away with several more questions that I hope Aart will address when the time comes.
Firstly, I would like to ask where the roadmap lies for improving machine learning in chip design. It is one thing to make an algorithm that finds a potentially good result and then scale it to produce hundreds or thousands of different configurations overnight, but is there an artificial ceiling on what can be considered best, limited perhaps by the nature of the algorithm being used?
Second, Aart and I discussed Google's match against Go master and 18-time world champion Lee Sedol, in which Google beat the world's best Go player 4-1 in a board game where, only five years prior, it was considered impossible for computers to come close to the best humans. In that match, both the Google DeepMind AI and the human player made a 1-in-10,000 move, which is rare in an individual game, but one might argue is more likely to occur in human interactions. My question to Aart is whether machine learning for chip design will ever experience those 1-in-10,000 moments, or, in more technical terms, whether the software can still find the global minimum if it gets stuck in a local minimum over such a large search space (around 10^2500 combinations for chip design vs 10^230 in Go).
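The local-vs-global-minimum question can be illustrated with a classic technique: an annealing-style search that occasionally accepts worse moves can cross the barrier between basins, where pure greedy descent would stall. This is a toy sketch with an invented one-dimensional cost function and parameters; it is not a claim about how DSO.ai or Google's tools actually search.

```python
import math
import random

def cost(x):
    # Invented landscape: a local minimum near x=1 (cost ~1) and a
    # deeper global minimum near x=4 (cost ~0), separated by a barrier.
    return min((x - 1) ** 2 + 1, (x - 4) ** 2)

def anneal(x=0.0, temp=2.0, cooling=0.995, steps=5000, seed=1):
    rng = random.Random(seed)
    best = x
    for _ in range(steps):
        candidate = x + rng.uniform(-0.5, 0.5)
        delta = cost(candidate) - cost(x)
        # Always accept improvements; accept worse moves with probability
        # exp(-delta / temp). These uphill moves are what let the search
        # escape the local basin before the temperature cools.
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            x = candidate
        if cost(x) < cost(best):
            best = x
        temp *= cooling
    return best

best = anneal()
print(best, cost(best))  # small cost if the global basin was reached
```

A purely greedy version of the same loop (accepting only delta < 0) starting at x=0 settles near x=1 and never sees the deeper minimum, which is the essence of the concern over a 10^2500-sized search space.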
Third, and perhaps more importantly, is how applying machine learning at different levels of the design can violate those layers. Most modern processor design relies on specific standard cells and pre-defined blocks, yet there will be situations where modified versions of those blocks are better in some design scenarios when coupled closely to different parts of the design. With all of these elements interacting with each other, with variable interaction effects, the complexity lies in managing these interactions within the machine learning algorithms in a time-efficient way, and how those tradeoffs are made is still a point to prove.
In my recent interview with Jim Keller, I asked him if at some point we will see silicon design look unfathomable to even the best engineers. He said, "Yeah, and it's coming pretty fast." It is one thing to talk holistically about what AI can bring to the world, but it's another to have it working in action to improve semiconductor design, providing a fundamental benefit at the base level of all silicon. I'm looking forward to further disclosures on AI-accelerated silicon design from Synopsys, its competitors, and hopefully some insights from those using it to design their processors.