Archive for the ‘Quantum Computer’ Category

133-qubit Quantum Heron launched by IBM – Electronics Weekly

IBM also unveiled IBM Quantum System Two, the company's first modular quantum computer and cornerstone of IBM's quantum-centric supercomputing architecture. The first IBM Quantum System Two, located in Yorktown Heights, New York, has begun operations with three IBM Heron processors and supporting control electronics.

With this critical foundation now in place, along with other breakthroughs in quantum hardware, theory, and software, the company is extending its IBM Quantum Development Roadmap to 2033 with new targets to significantly advance the quality of gate operations. Doing so would increase the size of quantum circuits able to be run and help to realize the full potential of quantum computing at scale.

"We are firmly within the era in which quantum computers are being used as a tool to explore new frontiers of science," said Dario Gil, IBM SVP and Director of Research. "As we continue to advance how quantum systems can scale and deliver value through modular architectures, we will further increase the quality of a utility-scale quantum technology stack and put it into the hands of our users and partners who will push the boundaries of more complex problems."

As demonstrated by IBM earlier this year on a 127-qubit IBM Quantum Eagle processor, IBM Quantum systems can now serve as a scientific tool to explore utility-scale classes of problems in chemistry, physics, and materials beyond brute force classical simulation of quantum mechanics.


IBM Quantum System Two is the foundation of IBM's next-generation quantum computing system architecture. It combines scalable cryogenic infrastructure and classical runtime servers with modular qubit control electronics.

The new system is a building block for IBMs vision of quantum-centric supercomputing. This architecture combines quantum communication and computation, assisted by classical computing resources, and leverages a middleware layer to appropriately integrate quantum and classical workflows.

As part of the ten-year IBM Quantum Development Roadmap, IBM plans for this system to also house IBM's future generations of quantum processors. Also, as part of this roadmap, these future processors are intended to gradually improve the quality of operations they can run, significantly extending the complexity and size of workloads they are capable of handling.

IBM is also detailing plans for a new generation of its software stack, within which Qiskit 1.0 will be a pivot point defined by stability and speed. Additionally, and with the goal of democratizing quantum computing development, IBM is announcing Qiskit Patterns.

Qiskit Patterns will serve as a mechanism to allow quantum developers to more easily create code. It is based on a collection of tools to map classical problems, optimize them to quantum circuits using Qiskit, execute those circuits using Qiskit Runtime, and then post-process the results. With Qiskit Patterns, combined with Quantum Serverless, users will be able to build, deploy, and execute workflows integrating classical and quantum computation in different environments, such as cloud or on-prem scenarios. All of these tools will provide building blocks for users to build and run quantum algorithms more easily.
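
The four-stage shape of this workflow (map, optimize, execute, post-process) can be sketched in plain Python. This is an illustrative toy pipeline, not the real Qiskit API: the function names and the list-of-gates "circuit" representation here are hypothetical stand-ins.

```python
# Illustrative sketch of the four-stage Qiskit Patterns workflow
# (map -> optimize -> execute -> post-process). Function names and the
# toy "circuit" representation are hypothetical, not the Qiskit API.

def map_problem(bitstring):
    # Map a classical problem (here: prepare a given bit pattern) to a
    # "circuit": one X gate per bit that should be set to 1.
    return [("x", i) for i, b in enumerate(bitstring) if b == 1]

def optimize(circuit):
    # Toy optimization pass: cancel adjacent duplicate gates (X.X = I).
    out = []
    for gate in circuit:
        if out and out[-1] == gate:
            out.pop()
        else:
            out.append(gate)
    return out

def execute(circuit, n_bits):
    # Toy "execution": apply the X gates to the all-zeros register.
    state = [0] * n_bits
    for name, i in circuit:
        if name == "x":
            state[i] ^= 1
    return state

def postprocess(state):
    # Interpret the measured bits as an integer result.
    return int("".join(map(str, state)), 2)

result = postprocess(execute(optimize(map_problem([1, 0, 1])), 3))
print(result)  # 5 (binary 101)
```

The point is the pipeline shape: each stage consumes the previous stage's output, which is what lets tools like Quantum Serverless schedule the classical and quantum parts separately.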

Additionally, IBM is pioneering the use of generative AI for quantum code programming through watsonx, IBM's enterprise AI platform. IBM will integrate generative AI available through watsonx to help automate the development of quantum code for Qiskit. This will be achieved through fine-tuning of the IBM Granite model series.

With advanced hardware across IBM's global fleet of 100+ qubit systems, as well as easy-to-use software that IBM is debuting in Qiskit, users and computational scientists can now obtain reliable results from quantum systems as they map increasingly larger and more complex problems to quantum circuits.

Continued here:
133-qubit Quantum Heron launched by IBM - Electronics Weekly

IBM showcases quantum computing chip with 2033 target for large computer systems – Proactive Investors USA

About Andrew Kessel

Andrew is a financial journalist with experience covering public companies in a wide breadth of industries, including tech, medicine, cryptocurrency, mining and retail. In addition to Proactive, he has been published in a Financial Times-owned newsletter covering broker-dealer firms and in the Columbia Missourian newspaper as the lead reporter focused on higher education. He got his start with an internship at Rolling Stone magazine.

Proactive financial news and online broadcast teams provide fast, accessible, informative and actionable business and finance news content to a global investment audience. All our content is produced independently by our experienced and qualified teams of news journalists.

Proactive's news team spans the world's key finance and investing hubs with bureaus and studios in London, New York, Toronto, Vancouver, Sydney and Perth.

We are experts in medium- and small-cap markets, but we also keep our community up to date with blue-chip companies, commodities and broader investment stories. This is content that excites and engages motivated private investors.

The team delivers news and unique insights across the market including but not confined to: biotech and pharma, mining and natural resources, battery metals, oil and gas, crypto and emerging digital and EV technologies.

Proactive has always been a forward-looking and enthusiastic technology adopter.

Our human content creators are equipped with many decades of valuable expertise and experience. The team also has access to, and uses, technologies that assist and enhance workflows.

Proactive will on occasion use automation and software tools, including generative AI. Nevertheless, all content published by Proactive is edited and authored by humans, in line with best practice in regard to content production and search engine optimisation.

Read the original post:
IBM showcases quantum computing chip with 2033 target for large computer systems - Proactive Investors USA

IBM brings ‘utility-scale’ quantum computing to Japan as China and Europe struggle to compete – Cointelegraph

IBM announced the completed installation of a 127-qubit quantum computing system at the University of Tokyo on Nov. 27. According to the company, this marks the arrival of the first utility-scale quantum system in the region.

The system, dubbed a Quantum System One by IBM and featuring the company's Eagle processor, was installed as part of an ongoing research partnership between Japan and IBM. According to a blog post from IBM, it will be used to conduct research in various fields, including bioinformatics, materials science and finance.

Per Hiroaki Aihara, executive vice president of the University of Tokyo:

While Japan and the University of Tokyo reap the benefits of working with a U.S. quantum computing partner, China's second-largest technology firm, Alibaba, has decided to shutter its own quantum computing laboratory and will reportedly donate its equipment to Zhejiang University.

Local media reports indicate the Alibaba move is a cost-cutting measure and that dozens of employees associated with the quantum research lab have been laid off. This follows the cancellation of a planned cloud computing spinoff earlier this month, with Alibaba stating that the partial United States export ban on computer chips to China has contributed to uncertainty.

Related: US official confirms military concerns over China's access to cloud technology

The quantum computing sector is expected to grow by more than $5.5 billion between 2023 and 2030, according to estimates from Fortune Business Insights. This has led some experts to worry over the state of quantum computing research in areas outside of the U.S. and China.

Koen Bertels, founder of quantum computing accelerator QBee and a professor at the University of Ghent in Belgium, recently opined that Europe had already lost the artificial intelligence race and couldn't afford to lose at quantum computing.

"In addition to being behind in funding, talent, and strategy," wrote Bertels, "Europe isn't only competing against the US."

See the original post here:
IBM brings 'utility-scale' quantum computing to Japan as China and Europe struggle to compete - Cointelegraph

Quantum Advantage: A Physicist Explains The Future of Computers – ScienceAlert

Quantum advantage is the milestone the field of quantum computing is fervently working toward, where a quantum computer can solve problems that are beyond the reach of the most powerful non-quantum, or classical, computers.

Quantum refers to the scale of atoms and molecules where the laws of physics as we experience them break down and a different, counterintuitive set of laws apply. Quantum computers take advantage of these strange behaviors to solve problems.

There are some types of problems that are impractical for classical computers to solve, such as cracking state-of-the-art encryption algorithms. Research in recent decades has shown that quantum computers have the potential to solve some of these problems.

If a quantum computer can be built that actually does solve one of these problems, it will have demonstrated quantum advantage.

I am a physicist who studies quantum information processing and the control of quantum systems.

I believe that this frontier of scientific and technological innovation not only promises groundbreaking advances in computation but also represents a broader surge in quantum technology, including significant advancements in quantum cryptography and quantum sensing.

Central to quantum computing is the quantum bit, or qubit. Unlike classical bits, which can only be in states of 0 or 1, a qubit can be in any state that is some combination of 0 and 1. This state of neither just 1 nor just 0 is known as a quantum superposition. With every additional qubit, the number of states that can be represented by the qubits doubles.
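
The doubling is easy to see if you represent an n-qubit state the way simulators do: as a vector of 2^n complex amplitudes, one per basis state. A minimal pure-Python sketch (the helper names are ours, chosen for illustration):

```python
import math

# An n-qubit state is a vector of 2**n complex amplitudes, so each
# added qubit doubles the number of basis states being tracked.

def n_qubit_zero_state(n):
    # |00...0>: amplitude 1 on the first basis state, 0 elsewhere.
    state = [0j] * (2 ** n)
    state[0] = 1 + 0j
    return state

def equal_superposition(n):
    # Every basis state with equal amplitude 1/sqrt(2**n), so the
    # probabilities |amplitude|**2 sum to 1.
    amp = 1 / math.sqrt(2 ** n)
    return [amp + 0j] * (2 ** n)

for n in (1, 2, 3):
    print(n, "qubits ->", len(equal_superposition(n)), "amplitudes")
```

This exponential growth in the amplitude vector is also why classical simulation of quantum systems becomes intractable beyond a few dozen qubits.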

This property is often mistaken for the source of the power of quantum computing. Instead, it comes down to an intricate interplay of superposition, interference and entanglement.

Interference involves manipulating qubits so that their states combine constructively during computations to amplify correct solutions and destructively to suppress the wrong answers.

Constructive interference is what happens when the peaks of two waves (like sound waves or ocean waves) combine to create a higher peak. Destructive interference is what happens when a wave peak and a wave trough combine and cancel each other out.
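
The simplest quantum example of both effects at once is applying the Hadamard gate twice to |0>. The two computational paths to |1> carry opposite signs and cancel (destructive), while the two paths to |0> add up (constructive), so the qubit returns to |0> with certainty. A small sketch:

```python
import math

# Interference demo: H applied twice to |0> returns |0> exactly,
# because the amplitudes for |1> cancel and those for |0> reinforce.

s = 1 / math.sqrt(2)
H = [[s, s],
     [s, -s]]  # the Hadamard matrix

def apply(gate, state):
    # Multiply a 2x2 gate into a single-qubit amplitude vector.
    return [sum(gate[i][j] * state[j] for j in range(2)) for i in range(2)]

state = [1.0, 0.0]       # |0>
state = apply(H, state)  # equal superposition: [0.707..., 0.707...]
state = apply(H, state)  # paths interfere: back to [1.0, 0.0]
print([round(x, 6) for x in state])  # [1.0, 0.0]
```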

Quantum algorithms, which are few and difficult to devise, set up a sequence of interference patterns that yield the correct answer to a problem.

Entanglement establishes a uniquely quantum correlation between qubits: The state of one cannot be described independently of the others, no matter how far apart the qubits are. This is what Albert Einstein famously dismissed as "spooky action at a distance."

Entanglement's collective behavior, orchestrated through a quantum computer, enables computational speed-ups that are beyond the reach of classical computers.
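
The canonical entangled state is the Bell state (|00> + |11>)/sqrt(2): measuring it always yields the two qubits agreeing, yet neither qubit alone has a definite value. A minimal sampling sketch over a 2-qubit amplitude vector (the `measure` helper is our own illustration):

```python
import math, random

# The Bell state on a 2-qubit statevector [a00, a01, a10, a11]:
# the result of H on one qubit followed by CNOT, starting from |00>.
s = 1 / math.sqrt(2)
bell = [s, 0.0, 0.0, s]

def measure(state):
    # Sample a basis state with probability |amplitude|**2.
    probs = [abs(a) ** 2 for a in state]
    r = random.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return format(i, "02b")  # bitstring, e.g. "00" or "11"
    return format(len(state) - 1, "02b")

samples = [measure(bell) for _ in range(1000)]
assert all(bits in ("00", "11") for bits in samples)  # qubits always agree
print(set(samples))
```

No assignment of independent values to each qubit reproduces this perfect correlation for all measurement settings, which is what makes the correlation "uniquely quantum."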

Quantum computing has a range of potential uses where it can outperform classical computers. In cryptography, quantum computers pose both an opportunity and a challenge. Most famously, they have the potential to decipher current encryption algorithms, such as the widely used RSA scheme.

One consequence of this is that today's encryption protocols need to be reengineered to be resistant to future quantum attacks. This recognition has led to the burgeoning field of post-quantum cryptography.

After a long process, the National Institute of Standards and Technology recently selected four quantum-resistant algorithms and has begun the process of readying them so that organizations around the world can use them in their encryption technology.
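
Why factoring breaks RSA can be made concrete with a toy example (deliberately tiny and insecure numbers): anyone who can factor the public modulus n = p*q can rebuild the private key. Shor's algorithm factors efficiently on a quantum computer; at real 2048-bit sizes the classical brute-force step below is infeasible.

```python
# Toy RSA (insecure, illustration only): factoring n recovers the key.

p, q = 61, 53
n = p * q                      # public modulus (3233)
phi = (p - 1) * (q - 1)
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent (modular inverse)

msg = 42
cipher = pow(msg, e, n)        # encrypt with the public key

# "Attack": factor n by trial division (trivial here, hopeless
# classically at real key sizes), then recompute the private key.
f = next(k for k in range(2, n) if n % k == 0)
phi_cracked = (f - 1) * (n // f - 1)
d_cracked = pow(e, -1, phi_cracked)
print(pow(cipher, d_cracked, n))  # 42 -- message recovered
```

The post-quantum algorithms NIST selected avoid this by resting on problems (such as hard lattice problems) for which no efficient quantum attack is known.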

In addition, quantum computing can dramatically speed up quantum simulation: the ability to predict the outcome of experiments operating in the quantum realm. Famed physicist Richard Feynman envisioned this possibility more than 40 years ago.

Quantum simulation offers the potential for considerable advancements in chemistry and materials science, aiding in areas such as the intricate modeling of molecular structures for drug discovery and enabling the discovery or creation of materials with novel properties.

Another use of quantum information technology is quantum sensing: detecting and measuring physical properties like electromagnetic energy, gravity, pressure and temperature with greater sensitivity and precision than non-quantum instruments.

Quantum sensing has myriad applications in fields such as environmental monitoring, geological exploration, medical imaging and surveillance.

Initiatives such as the development of a quantum internet that interconnects quantum computers are crucial steps toward bridging the quantum and classical computing worlds.

This network could be secured using quantum cryptographic protocols such as quantum key distribution, which enables ultra-secure communication channels that are protected against computational attacks including those using quantum computers.
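
The logic of quantum key distribution can be sketched classically. In the BB84 protocol, Alice encodes random bits in randomly chosen bases and Bob measures in his own random bases; afterwards they publicly compare bases (never bits) and keep only the positions where the bases matched. This sketch models an ideal, eavesdropper-free channel; in the real protocol, an eavesdropper measuring in the wrong basis disturbs the qubits and is detected.

```python
import random

# A compact classical sketch of BB84 key agreement (ideal channel).
random.seed(7)  # fixed seed so the run is reproducible
N = 64
alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice("+x") for _ in range(N)]
bob_bases   = [random.choice("+x") for _ in range(N)]

# If bases match, Bob reads Alice's bit; otherwise his outcome is a
# 50/50 coin flip (the wrong basis yields no information).
bob_bits = [a if ab == bb else random.randint(0, 1)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Publicly compare bases (not bits) and keep the matching positions.
key_a = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)
         if ab == bb]
key_b = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases)
         if ab == bb]

assert key_a == key_b   # both sides now hold the same secret key
print(len(key_a), "shared key bits out of", N, "transmissions")
```

On average half the bases match, so roughly N/2 raw key bits survive; real deployments then sacrifice a subset of those bits to estimate the error rate and detect eavesdropping.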

Despite a growing application suite for quantum computing, developing new algorithms that make full use of the quantum advantage, in particular in machine learning, remains a critical area of ongoing research.

The quantum computing field faces significant hurdles in hardware and software development. Quantum computers are highly sensitive to any unintentional interactions with their environments. This leads to the phenomenon of decoherence, where qubits rapidly degrade to the 0 or 1 states of classical bits.

Building large-scale quantum computing systems capable of delivering on the promise of quantum speed-ups requires overcoming decoherence. The key is developing effective methods of suppressing and correcting quantum errors, an area my own research is focused on.
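
The core idea of error correction is redundancy. The simplest case, the 3-qubit bit-flip (repetition) code, can be illustrated in classical terms: encode one logical bit as three physical bits and decode by majority vote, turning an error rate of p into roughly 3p^2. Real quantum codes must protect amplitudes without directly measuring them, but the redundancy principle is the same.

```python
import random

# Classical sketch of the 3-bit repetition code under bit-flip noise.

def encode(bit):
    return [bit, bit, bit]

def noisy(bits, p, rng):
    # Flip each bit independently with probability p.
    return [b ^ (rng.random() < p) for b in bits]

def decode(bits):
    return int(sum(bits) >= 2)   # majority vote

rng = random.Random(0)
p = 0.05
trials = 10_000
raw_errors = sum(rng.random() < p for _ in range(trials))
coded_errors = sum(decode(noisy(encode(0), p, rng)) != 0
                   for _ in range(trials))
print(raw_errors, "raw vs", coded_errors, "coded errors")
# coded failures need >= 2 flips, rate ~3p**2, far below p
```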

In navigating these challenges, numerous quantum hardware and software startups have emerged alongside well-established technology industry players like Google and IBM.

This industry interest, combined with significant investment from governments worldwide, underscores a collective recognition of quantum technology's transformative potential. These initiatives foster a rich ecosystem where academia and industry collaborate, accelerating progress in the field.

Quantum computing may one day be as disruptive as the arrival of generative AI. Currently, the development of quantum computing technology is at a crucial juncture.

On the one hand, the field has already shown early signs of having achieved a narrowly specialized quantum advantage. Researchers at Google and later a team of researchers in China demonstrated quantum advantage for generating a list of random numbers with certain properties. My research team demonstrated a quantum speed-up for a random number guessing game.

On the other hand, there is a tangible risk of entering a "quantum winter," a period of reduced investment if practical results fail to materialize in the near term.

While the technology industry is working to deliver quantum advantage in products and services in the near term, academic research remains focused on investigating the fundamental principles underpinning this new science and technology.

This ongoing basic research, fueled by enthusiastic cadres of new and bright students of the type I encounter almost every day, ensures that the field will continue to progress.

Daniel Lidar, Professor of Electrical Engineering, Chemistry, and Physics & Astronomy, University of Southern California

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Continued here:
Quantum Advantage: A Physicist Explains The Future of Computers - ScienceAlert

Analyst Panel Says Take the Quantum Computing Plunge Now – HPCwire

Should you start exploring quantum computing? Yes, said a panel of analysts convened at Tabor Communications' HPC and AI on Wall Street conference earlier this year.

Without doubt, the quantum computing landscape remains murky. Yet in the past ~5 years virtually every aspect of quantum computing has raced forward. At least one 1,000-plus-qubit system is edging towards user access now and another is expected by year-end. There's been a proliferation of software offerings up and down the quantum stack, though it's hardly complete. Most promising, what were a few POC use-case explorations have mushroomed into many efforts across many sectors.

What are we waiting for? Against the backdrop of astonishing progress are also very hard technical problems. Error correction/mitigation tops the list. Effective quantum networking is another. Polished applications. Too many qubit types to choose from (at least for now). Scale matters: it's expected that millions of qubits may be needed for practical quantum computing. These aren't trivial challenges. Why bother?

The best reason to proceed, perhaps, is there's little choice. The roaring geopolitical rivalry around getting to practical quantum computing fast, which includes robust spending from the U.S., the U.K., the EU, and China, is concrete evidence.

Panelist Bob Sorensen, Hyperion Research's chief quantum watcher, zeroed in on quantum's rush to integrate into what is an otherwise stalling HPC (hardware) picture.

"It's no secret in the HPC world that the trajectory of performance gains is flattening out; we're reaching a bunch of ends, if you will: the end of Moore's law, the ability to pack more transistors on a chip; Dennard scaling, you can only put so much power into a chip; the idea of lithographic capabilities running out. We're at sub-nanometer line-width lithography, [and] there's only one company in the world that makes advanced lithography components, ASML out of the Netherlands, that can only supply two really competent silicon foundries to produce the advanced chips that the HPC world needs, TSMC and Samsung," said Sorensen.

"So, the trajectory of HPC performance is falling off, and the timing for quantum is perfect. It's the next turn of the crank in accelerating performance. What that means is if you want to continue on your journey of advanced computing, you have to look for the next big thing. What's interesting about quantum is its potential is the most attractive, and it is on a different trajectory than where classical HPC is going right now. That's really the promise. So, if you want to get started, you have to do a few things.

"We look at quantum not as a separate island unto itself of a new compute capability. It's really more about accelerating the most complex, advanced workloads that the HPC world is always tackling. So, we view this as another turn of the crank in terms of accelerating opportunities in advanced computing. How you get started is you look at your most complicated, vexing computational problems."

It was a fascinating discussion, and Tabor has archived the full video. The focus was not on exotic quantum technologies, important but not easily accessible to most of us, but on how and why to get started exploring them.

Panelists included Heather West, IDC's lead quantum analyst and a research manager in IDC's Infrastructure Systems, Platforms and Technologies Group; Sorensen, Hyperion's senior vice president of research and chief quantum analyst; and Jay Boisseau, CEO of Vizias, a former prominent Dell Technologies executive and a founder of the Texas Advanced Computing Center. HPCwire editor John Russell moderated.

West presented a handful of slides, nicely mapping the emerging quantum information sciences market, and then the panel tackled why now is the right time and offered tips on how to do it. Central to their argument is that quantum is coming on fast, that getting access to tools and QPUs is fairly easy and inexpensive via web platforms such as AWS Braket and Strangeworks, and that failure to get involved now is likely to slow your progress later.

Presented here are just a few comments from the panelists. Let's start with a few slides depicting quantum's development, presented by West. Her full slide deck is presented in the video.

West noted the quantum forecasts are dynamic in that conditions can change quickly and that IDC incorporates changes as their impact becomes clearer. For example, IDC scaled back its total 2027 spending forecast from ~$8.6B to $7.6B based on these shifts. Despite these shifts, quantum spending plans are growing significantly as a portion of the IT budget.

"Over the course of the last 20 years, we've seen [quantum computing] move from an academic achievement to now small-scale systems [that can be used] for small-scale experimentation. Hopefully, within the next few years, we'll be able to see systems that leverage error correction and mitigation techniques, as well as a little bit of scaling, to deliver some sort of near-term advantage," said West.

IDC does a nice job slicing up quantum segments. Looking at the proliferation of quantum hardware developers, she said, "We divide them into two different categories, hardware developers versus hardware vendors. The difference between the two is that the vendors have graduated to the point where they're able to offer access to their systems and services for a premium fee so that organizations such as yours are able to use them, leverage them for some experimentation, use-case identification, etc." (see slide below)

Taking a lesson from the past, Sorensen and Boisseau recalled the historically high cost of adopting next-gen HPC systems.

Sorensen said, "What's so magical about quantum right now is the beauty of the low barrier to entry. In the old days if you wanted to get an HPC system, and Jay knows this, you had to drop $25 million to bring a Cray in. You had to hire 25 guys and they lived downstairs in the basement. They never came out, and they wrote code all the time and spoke a language that you didn't understand, and you had to pay them an awful lot of money to do that. The barriers to entry to get into HPC were high.

"The barrier to entry in quantum is you sit down, you go to AWS or Strangeworks. You pick your cloud access model of choice, you sign up for a couple of bucks, you grab a couple of new hires that just came out of school with a degree in quantum chemistry or something, and you go and you play, and you figure out how that's going to work. So, the barriers to entry of quantum are amazing. I've said it before, and I'll say it again: if it wasn't for cloud access, none of us would be sitting here vaguely interested in quantum; it's what really is driving interest."

Boisseau had a similar take. "You don't have to choose a partner. You don't have to make that decision. In fact, I think it'd be a bad play to make that decision now. You can go to any of the CSP-based infrastructure providers (with quantum gateways) and say I want to run this job on a D-Wave system, I want to run this on IonQ, and I want to run on Rigetti systems, and you can do that rather seamlessly," he said.

"The interesting thing, and I'm an electrical engineer so I tend to look at things very pragmatically, is that right now, a lot of the software that's running out there is, quote-unquote, hardware-agnostic, which means you can run it on any (quantum) hardware you want. So again, you don't have to make these choices yet, because it's really too early to tell who's going to win, who's going to lose. Hardware-agnostic is really great in the early days, but eventually we're going to turn the crank and people are going to start to say we need to optimize our code to run on certain things. But right now, the freedom to explore is what really matters most," said Boisseau.
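
What "hardware-agnostic" means in practice can be sketched in a few lines: application code targets an abstract backend interface, and the concrete provider is chosen only at submission time. The `Backend` class, `LocalSimulator`, and the gate-list circuit here are hypothetical stand-ins for illustration, not any real vendor SDK.

```python
from abc import ABC, abstractmethod

# Hypothetical sketch of a hardware-agnostic submission layer: swap
# the Backend implementation, and the calling code never changes.

class Backend(ABC):
    @abstractmethod
    def run(self, circuit: list, shots: int) -> dict:
        """Run a circuit and return measurement counts."""

def circuit_width(circuit):
    # Number of qubits = highest qubit index used, plus one.
    return 1 + max(q for _, q in circuit)

class LocalSimulator(Backend):
    def run(self, circuit, shots):
        # Trivial stand-in "device": ignores the gates and reports the
        # all-zeros outcome for every shot.
        return {"0" * circuit_width(circuit): shots}

def submit(backend: Backend, circuit, shots=100):
    return backend.run(circuit, shots)

counts = submit(LocalSimulator(), [("h", 0), ("x", 1)], shots=100)
print(counts)  # {'00': 100}
```

A cloud gateway plays the role of `submit` here: the same circuit description is routed to a D-Wave, IonQ, or Rigetti target behind the common interface.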

There was, of course, more to the broad panel discussion, including advice on choosing the right problems, for example, and a measure of caution, including brief discussion of a paper published last spring (Disentangling Hype from Practicality: On Realistically Achieving Quantum Advantage) by Matthias Troyer of Microsoft and colleagues. And Microsoft is firmly in the quantum hunt! (See HPCwire coverage, Microsoft and ETH Take Aim at Quantum Computing's Hype (and Promise).)

West noted, "Not everybody's as optimistic, and some people are still deterred from adopting quantum because of costs, because of the maturity or lack of maturity of the systems, and whether or not it actually will be relevant to the problems that they're willing to solve. However, those organizations really should start to take note, because the quantum era is approaching faster than what some might want to say. We still need to put that in a little bit of context: quickly approaching and being able to deliver a near-term advantage, that's probably five to seven years out. So, quick is not going to be in the next six months. It's not going to be next year, but quicker than the decades and decades which were thought earlier."

Best to watch the full video, https://www.hpcaiwallstreet.com/conference/quantum-computing-analyst-panel-one-year-later/

Top image is a photo of the first deployment of an onsite private sector IBM-managed quantum computer in the United States installed at Cleveland Clinic.

Read more:
Analyst Panel Says Take the Quantum Computing Plunge Now - HPCwire