ANL Special Colloquium on The Future of Computing
There are, of course, myriad ideas regarding computing's future. At yesterday's Argonne National Laboratory Director's Special Colloquium, "The Future of Computing," guest speaker Sadasivan Shankar did his best to convince the audience that the high energy cost of the current computing paradigm (not just its economic cost; we're talking entropy here) is fundamentally undermining computing's progress, such that it will never be able to solve today's biggest challenges.
The broad idea is that the steady abstracting away of informational content from each piece of modern computing's complicated assemblage (chips, architecture, programming) inexorably increases the cumulative energy cost, pushing toward a hard ceiling. Leaving aside, for a moment, the decline of Moore's law (just a symptom, really), it is the separation (abstraction) of information from direct computation that is the culprit, argues Shankar. Every added step adds energy cost.
Nature, on the other hand, bakes information into things. Consider, said Shankar, how a string of amino acids folds into its intended 3-D conformation on a tiny energy budget and in a very short time just by interacting with its environment, and contrast that with the amount of compute required (i.e., energy expended) to accurately predict protein folding from a sequence of amino acids. Shankar, research technology manager at SLAC National Laboratory and adjunct Stanford professor, argues computing must take a lesson from nature and strive to pack information more tightly into applications and compute infrastructure.
Information theory is a rich field with a history of rich debate. Turning theory into practice has often proven more difficult and messy. Shankar and his colleagues have been developing a formal framework for classifying the levels of information content in human-made computation schemes and natural systems in a way that permits direct comparison between the two. The resulting scale has eight classification levels (0-7).
There's a lot to digest in Shankar's talk. Rather than going off the rails here with a garbled explanation, it's worth noting that Argonne has archived the video and that Shankar has a nearly finished paper expected in a couple of months. No doubt some of his ideas will stir conversation. Given that Argonne will be home to Aurora, the exascale supercomputer now being built at the lab, it was an appropriate site for a talk on the future of computing.
Before jumping into what the future may hold, here's a quick summary of Shankar's two driving points: 1) Moore's law, or more properly the architecture and semiconductor technology on which it rests, is reaching its limits, and 2) the growing absolute energy cost of information processing using traditional (von Neumann) methods is itself a limit.
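To give a sense of the scale behind the entropy argument, here is a small back-of-envelope sketch (ours, not Shankar's). It compares the Landauer limit, the thermodynamic minimum energy to erase one bit at a given temperature, against a rough order-of-magnitude figure for the energy a CMOS logic operation actually dissipates; the CMOS number is an illustrative assumption, not a measured value.

```python
import math

# Landauer limit: minimum energy to erase one bit, E = k_B * T * ln(2)
K_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

landauer_j_per_bit = K_B * T * math.log(2)  # ~2.9e-21 J

# ASSUMPTION (illustrative, not from the talk): a modern CMOS logic
# operation dissipates very roughly on the order of 1e-15 J per bit switch.
cmos_j_per_bit = 1e-15

print(f"Landauer limit at 300 K:      {landauer_j_per_bit:.2e} J/bit")
print(f"Assumed CMOS switching energy: {cmos_j_per_bit:.1e} J/bit")
print(f"Gap: ~{cmos_j_per_bit / landauer_j_per_bit:.0e}x above the thermodynamic floor")
```

However one picks the CMOS figure, conventional logic sits several orders of magnitude above the thermodynamic floor, which is the headroom Shankar argues abstraction keeps eating into.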
A big part of the answer to the question of how computing must progress, suggested Shankar, is to take a page from Feynman's reverberating idea (not just for quantum computing) and emulate the way nature computes, "pack[ing] all of the information needed for the computing into the things themselves," or at least reducing abstraction as much as possible.
Argonne assembled an expert panel to bat Shankar's ideas around. The panel included moderator Rick Stevens (associate laboratory director and Argonne distinguished fellow), Salman Habib (director, Argonne computational science division and Argonne distinguished fellow), Yanjing Li (assistant professor, department of computer science, University of Chicago), and Fangfang Xia (computer scientist, data science and learning division, ANL).
Few quibbled with the high energy cost of computing as described by Shankar, but the panelists had a variety of perspectives on moving forward. One of the more intriguing comments came from Xia, an expert in neuromorphic computing, who suggested that using neuromorphic systems to discover new algorithms is a potentially productive approach.
"My answer goes back to the earlier point Sadas and Rick made, which is: if we're throwing away efficiency in the information-power conversion process, why don't we stay with biological systems for a bit longer? There's this interesting field called synthetic biological intelligence. They are trying to do these brain-computer interfaces, not in a Neuralink way, because that's still shrouded in uncertainty. But there is a company and they grow these brain cells in a petri dish. Then they connect this to an Atari Pong game. And you can see that after just 10 minutes, these brain cells self-organize into neural networks, and they can learn to play the game," said Xia.
"Keep in mind, this is 10 minutes in real life; it's not simulation time. It's only dozens of games, just like how we pick up games. So this data efficiency is enormous. What I find particularly fascinating about this is that in this experiment there was no optimization goal. There is no loss function you have to tweak. The system, when connected in this closed-loop fashion, will just learn in an embodied way. That opens so many possibilities; you think about all these dishes, just consuming glucose, you can have them learn latent representations, maybe to be used in digital models."
Li, a computer architecture expert, noted that general-purpose computing infrastructure has existed largely unchanged for a long time.
"I remember this is the same architecture of processor design I learned at school, and I still teach the same materials today. For the most part, when we're trying to understand how CPUs work, and even some of the GPUs, those have been around for a long time. I don't think there have been a lot of revolutionary changes to those architectures. There's a reason for that: we have developed good tool chains, the compiler tool chains; people are educated to understand and program and build those systems. So anytime we want to make a big change, [it has] to be competitive and as usable as what we know of today," Li said.
On balance, she expects more incremental changes. "I think it's not going to be just a big jump and we'll get there tomorrow. We have to build on small steps, building on existing understanding and also evolving along with the application requirements. I do think that there will be places where we can increase energy efficiency. If we're looking at the memory hierarchy, for example, we know caches help us with performance. But they're also super inefficient from an energy standpoint. This has worked for a long time because traditional applications have good locality, but we are increasingly seeing new applications that may not have as much locality, so there's room for innovation in the memory hierarchy path. For example, we can design different memory reference patterns and infrastructures for applications that do not exhibit locality. That would be one way of making the whole computing system much more efficient."
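Li's locality point is easy to demonstrate. The sketch below (ours, not from the talk) runs the same arithmetic over a row-major NumPy matrix twice, once walking along memory order and once against it; only the access pattern, and hence the cache behavior, differs.

```python
import time
import numpy as np

# A C-contiguous (row-major) matrix: elements of a row are adjacent in memory.
n = 4000
a = np.random.rand(n, n)

def time_it(fn):
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

# Row-wise traversal walks memory sequentially: cache-friendly.
row_major = time_it(lambda: sum(float(a[i].sum()) for i in range(n)))

# Column-wise traversal strides n*8 bytes between elements: cache-hostile.
col_major = time_it(lambda: sum(float(a[:, j].sum()) for j in range(n)))

print(f"row-wise:    {row_major:.3f} s")
print(f"column-wise: {col_major:.3f} s  (same arithmetic, worse locality)")
```

On typical hardware the column-wise pass runs several times slower despite performing identical arithmetic, which is exactly the gap that locality-aware memory hierarchy designs try to close, and which vanishes as an advantage for applications with little locality to exploit.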
Li noted the trend toward specialized computing was another promising approach: "If we use a general-purpose computing system like a CPU, there's overhead that goes into fetching the instructions and decoding them. All of those overheads are not directly solving the problem; they're just what you need to get the generality to solve all problems. Increasing specialization, offloading different specialized tasks, would be another interesting way of approaching this problem."
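The fetch/decode overhead Li describes can be caricatured in a few lines. In the sketch below (a toy analogy, not how a CPU is actually built), a tiny "interpreter" pays a fetch-and-dispatch cost on every step, while a specialized routine does only the useful arithmetic, the software analogue of replacing a general-purpose pipeline with a fixed-function unit.

```python
import time

# A "program" of generic instructions, repeated many times.
PROGRAM = [("add", 3), ("mul", 2), ("add", 1)] * 100_000

def run_general(program, x):
    # General machinery: fetch each instruction, decode the opcode, dispatch.
    for op, arg in program:
        if op == "add":
            x += arg
        elif op == "mul":
            x *= arg
    return x

def run_specialized(x, n=100_000):
    # The same computation with all "decoding" done once, ahead of time.
    for _ in range(n):
        x = (x + 3) * 2 + 1
    return x

for label, fn in [("general", lambda: run_general(PROGRAM, 0)),
                  ("specialized", lambda: run_specialized(0))]:
    t0 = time.perf_counter()
    fn()
    print(f"{label}: {time.perf_counter() - t0:.3f} s")
```

Both functions compute the same result; the difference in runtime is pure dispatch overhead, the price of generality that specialization trades away.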
There was an interesting exchange between Shankar and Stevens over the large amount of energy consumed in training today's large natural language processing models.
Shankar said, "I'm quoting from literature on deep neural networks, or any of these image-recognition networks. They scale quadratically with the number of data points. One of the latest things being hyped in the last few weeks is a trillion-parameter natural language processing [model]. So here are the numbers. To train one of those models takes the energy equivalent of four cars being driven a whole year, just to train the model, including the manufacturing cost of the cars. That is how much energy is spent on this training, so there is a real problem, right?"
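The quadratic-scaling claim can be illustrated with a rough model. Assuming the common ~6·N·D estimate of training FLOPs (N parameters, D training tokens) and an assumed delivered accelerator efficiency, growing model and dataset together by a factor k multiplies energy by k²; every number below is an illustrative assumption of ours, not one of Shankar's figures.

```python
# Sketch (our assumptions, not Shankar's figures) of how training energy
# compounds when model size and dataset grow together.

FLOPS_PER_JOULE = 1e11  # ASSUMPTION: delivered accelerator efficiency

def training_energy_kwh(n_params, n_tokens):
    flops = 6 * n_params * n_tokens          # ~6*N*D rule-of-thumb FLOPs
    return flops / FLOPS_PER_JOULE / 3.6e6   # joules -> kWh

base_params, base_tokens = 1e9, 1e9  # ASSUMPTION: arbitrary baseline
for k in (1, 10, 100, 1000):
    kwh = training_energy_kwh(k * base_params, k * base_tokens)
    print(f"scale x{k:>4}: ~{kwh:,.0f} kWh")
```

Scaling both N and D by 10 multiplies the energy by 100; whatever the absolute numbers, the compounding is the point.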
Not so fast, countered Stevens. "Consider using the same numbers for how much energy is going into Bitcoin, right? The estimate is maybe something like 5 percent of global energy production. At least these neural network models are useful. They're not just used for natural language processing. You can use them for distilling knowledge. You can use them for imaging and so forth. I want to shift gears a little bit. Governments around the world and VCs are putting a lot of money into quantum computing, and based on what you were talking about, it's not clear to me that that's actually the right thing we should be doing. We have lots of opportunities for alternative computing models, alternative architectures, that could open up spaces that we know in principle can work. We have classical systems that can do this," he said.
Today, there's an army of computational scientists around the world seeking ways to advance computing, some focused on the energy aspect of the problem, others on areas such as performance or capacity. It will be interesting to see if the framework and methodology embodied in Shankar's forthcoming paper not only provoke discussion but also provide a concrete way of comparing computing system efficiency.
Link to ANL video: https://vimeo.com/event/2081535/17d0367863
Brief Shankar Bio
Sadasivan (Sadas) Shankar is Research Technology Manager at SLAC National Laboratory and Adjunct Professor in Materials Science and Engineering at Stanford. He is also an Associate in the Department of Physics in Harvard's Faculty of Arts and Sciences, and was the first Margaret and Will Hearst Visiting Lecturer at Harvard University and the first Distinguished Scientist in Residence at the Harvard Institute of Applied Computational Sciences. He has co-instructed classes related to materials, computing, and sustainability, and was awarded a Harvard University Teaching Excellence Award. His research spans materials, chemistry, specialized AI methods for complex problems in the physical and natural sciences, and new frameworks for studying computing. He is a co-founder and the Chief Scientist of Material Alchemy, a "last mile" translational and independent venture for sustainable design of materials.
Dr. Shankar was a Senior Fellow at UCLA-IPAM during a program on Machine Learning and Many-Body Physics, an invited speaker at The Camille and Henry Dreyfus Foundation on applications of machine learning for chemistry and materials, a Carnegie Science Foundation panelist on Brain and Computing, a National Academies speaker on Revolutions in Manufacturing through Mathematics, an invitee to a White House event for the Materials Genome initiative, a Visiting Lecturer at the Kavli Institute for Theoretical Physics at UC Santa Barbara, and the first Intel Distinguished Lecturer at Caltech and MIT. He has given colloquia and lectures at universities all over the world. Dr. Shankar also worked in the semiconductor industry in the areas of materials, reliability, processing, and manufacturing, and is a co-inventor on over twenty patent filings. His work has been featured in the journal Science and as a TED talk.