Archive for the ‘Quantum Computer’ Category

Quantum technology professor Pepijn Pinkse: The best time to get quantum security right was yesterday. – Innovation Origins

His inaugural lecture took place early last month; in practice, Pepijn Pinkse has been working as a professor of quantum technology at the University of Twente (UT) for several years. His lecture focused on creating awareness around quantum security and the threat posed by quantum technology. "The best time to get quantum security right was yesterday," he said.

Quantum security is crucial to the future of privacy and data security. Professor Pepijn Pinkse, a pioneer at the University of Twente, is developing groundbreaking methods to secure data in an unbreakable way. Twente leads the world when it comes to quantum technology.

"See this laser beam? It's a neat bundle of light waves falling in line." Pinkse holds an A4 sheet in front of the camera and shines a laser pointer at it. "If I put a piece of tape on the laser, you can see that a complex pattern of speckles forms in the light. As soon as you add five photons to that pattern, they distribute themselves among the speckles. We use the combination of quantum light (light with a small number of photons) and a complex pattern to read out a key."

Or, in other words, Pinkse has developed a key that cannot be copied, even when someone has all the information. The key is verified by shining a light pulse on it with fewer photons (light particles) than there are spatial degrees of freedom (speckles). The professor's contribution was instrumental in inventing this Quantum-Secure Authentication method, which was largely developed in Twente.

Developing authentication methods is so important because the advent of quantum computers poses risks to data security. Once quantum computers are powerful and reliable enough, most current cryptographic security methods of the Internet and data files will be vulnerable overnight.

Quantum computers operate on a different principle than classical computers. The main difference is the fundamental unit of information, the bit. The conventional digital bit knows only two states, 0 or 1, and thus performs calculations incrementally. The information unit of a quantum computer, the qubit, can be in both states simultaneously. This condition is referred to as superposition. "Because of this parallel mode of operation, the computation time on a quantum computer grows much less rapidly with the size of the problem, and in the future they can solve complex tasks that are too difficult for classical computers," Pinkse explains.
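A toy calculation, not from the article, can make superposition concrete: a Hadamard gate turns a definite 0 into an equal mix of 0 and 1, and a measurement then gives either outcome with 50 percent probability. The short Python/NumPy sketch below is purely illustrative (it simulates one qubit classically; it is not how real quantum hardware is programmed).

```python
import numpy as np

# Toy classical simulation of a single qubit; illustrative only.
ket0 = np.array([1.0, 0.0])                   # |0>: the qubit definitely reads 0
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

state = H @ ket0                              # equal superposition of |0> and |1>
probs = np.abs(state) ** 2                    # Born rule: measurement probabilities

rng = np.random.default_rng(seed=1)
samples = rng.choice([0, 1], size=1000, p=probs)  # simulate 1000 measurements
print("amplitudes:", state)                        # [0.707 0.707]
print("P(0), P(1):", probs)                        # [0.5 0.5]
print("ones measured out of 1000:", samples.sum()) # roughly 500
```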

Pinkse studied physics at Leiden University and received his doctorate from the University of Amsterdam. He spent ten years at the renowned Max Planck Institute for Quantum Optics. In 2009, he transferred to UT, where he did pioneering work on quantum-secure authentication. In 2013, he received a Vici grant, the Dutch Research Council's (NWO) highest personal grant, for his research. Since 2019, Pinkse has been a professor of Adaptive Quantum Optics. He is also the director of the center for Quantum NanoTechnology Twente (QUANT) and co-founder of the spin-off Quix Quantum.

"Most of our current cryptography (think of Internet banking, for example) is based on the fact that you can easily multiply two large prime numbers together," Pinkse explains. "Making the sum the other way around is difficult."

Prime numbers are divisible only by 1 and themselves, such as 7, 11, or 61; numbers like 6 and 15 are not. Consider the following calculation: 71 x 61 = 4331. Determining which multiplication produces 4331 is much more difficult, because you have to try numerous options.
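To make that asymmetry concrete, here is a small, illustrative Python sketch (mine, not the article's): the forward multiplication is a single step, while recovering the factors of 4331 by trial division means testing candidate divisors one after another. For the hundreds-of-digits numbers used in real cryptography, the number of candidates becomes astronomically large, which is exactly the gap Shor's algorithm on a quantum computer would close.

```python
# Forward direction: multiplying two primes is one easy step.
p, q = 71, 61
print(p * q)  # 4331

# Reverse direction: factoring by trial division, testing divisors one by one.
def trial_factor(n):
    tries = 0
    d = 2
    while d * d <= n:
        tries += 1
        if n % d == 0:
            return d, n // d, tries
        d += 1
    return n, 1, tries  # no divisor found: n itself is prime

factor_a, factor_b, attempts = trial_factor(4331)
print(factor_a, factor_b, attempts)  # 61 71 60 -> sixty candidates tried
```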

Pinkse: "Shor's algorithm can perform that reverse computation efficiently, although it needs a large and good universal quantum computer to do so. As a result, much of our encrypted data is no longer secure."

This is not yet the case, as quantum computers currently have a small number of memory elements (qubits) and are noisy. The professor expects it will be about ten years before Q-day, the day when current cryptographic security systems succumb to the pressure of quantum computers, arrives. But that doesn't mean we shouldn't take action now, Pinkse warns: "If in ten years there is a working universal quantum computer that can decipher eavesdropped messages from today, we need to start protecting against it now with encryption techniques that cannot be broken even then."

Quantum technology has long been used in semiconductors, lasers, and MRI scanners. Even smartphones and the Internet would not exist without this technology. However, these applications do not (yet) use quantum information based on specific properties of quantum particles, such as entanglement. The applications Pinkse talks about in this article, also known as Quantum Technology 2.0, do.

Besides the risks involved in this elusive technology, the potential is huge. Pinkse: "Quantum computers can help us understand chemical reactions much better, enabling smarter batteries and more effective medicines. Quantum is a key technology for the energy transition and health care."

The first universal prototype of a quantum computer might come out of Twente. UT spin-off Quix Quantum, of which Pinkse is a co-founder, is developing a universal quantum computer. The company hopes to have the prototype ready in three years; the computer has already been sold to the German Aerospace Center (DLR). Twente's quantum computer runs on light and is leading the way in Europe. Pinkse: "The American competitor PsiQuantum has raised hundreds of millions in investments but has not sold anything yet."

Whereas commerce was often a dirty word in the early years of his career, at UT it is anything but. "At many institutes and universities, commerce is an afterthought that distracts from the science itself. Here, the emphasis is very much on the contribution you can make to society through your research. That makes my work incredibly fun."

Quandela takes the quantum computer from lab to fab for first time

Quandela's new factory, south of Paris, will produce three machines in six months instead of the one it currently produces.

More here:
Quantum technology professor Pepijn Pinkse: The best time to get quantum security right was yesterday. - Innovation Origins

3 Stocks Leading the Quantum Computing Revolution – InvestorPlace

In recent times, the fascination with quantum computing has surged, driven by technological advancements and a notable uptick in investments. More and more companies and institutes are performing a comprehensive exploration of the quantum-computing landscape and searching for use cases. Along these lines, investors are seeking to better understand quantum computing stocks and how they will affect various sectors.

For instance, there has been a lot of focus on the pharmaceutical industry in recent years. Quantum computing has touched this industry through the recruitment of quantum scientists, who have set out to explore potential applications like quantum simulation in drug design.

Moreover, quantum technology holds significant implications for the financial-services sector, particularly in the realm of security, signaling its potential to reshape how businesses and industries operate.

Here are three leading quantum computing stocks that could lead the next tech revolution.


International Business Machines (NASDAQ:IBM) is probably the safest bet on quantum computing, and arguably one of the top quantum computing stocks one should own. IBM operates in various domains, including cloud computing, artificial intelligence, and data analytics.

With a global presence, IBM provides solutions and services to enterprises, leveraging its expertise in cutting-edge technologies. Most recently, the company has increased its focus on AI and machine learning applications.

Through the IBM Quantum Network, the company collaborates with more than 250 organizations, including Fortune 500 companies, universities, labs, and startups. The network fosters partnerships, provides exclusive access to meetings and channels, and facilitates collaborative efforts and close interactions with IBM's internal experts.

Most recently, IBM introduced the IBM Quantum Heron, marking the first in a new series of utility-scale quantum processors. Engineered over four years, it boasts IBM's highest performance metrics and lowest error rates among its quantum processors.

Furthermore, IBM unveiled the Quantum System Two, its inaugural modular quantum computer. The system is operational with three IBM Heron processors and associated control electronics, representing a significant step in IBM's quantum-centric supercomputing architecture.


Quantum Computing (NASDAQ:QUBT) is the pure-play name to own in the quantum computing sector. QUBT aims to provide widely accessible and cost-effective quantum solutions for real-world business applications.

The company's model is based on vendor-neutral software and ready-to-run systems. These offer business users immediate access to various quantum processing units and quantum technologies.

The acquisition of QPhoton, a quantum photonics innovation company, enhances QUBT's capabilities with a series of quantum photonic systems (QPS). The integration of Quantum's flagship software, Qatalyst, with QPhoton's QPS positions the company to provide a widely accessible and cost-effective quantum solution.

Quantum recently achieved a significant milestone in commercializing its cutting-edge computing technologies, securing hardware sales of its Reservoir Computer and Quantum Random Number Generator. The Reservoir Computer reportedly exhibits exceptional speed and efficiency, properties that allow it to enhance data analysis, machine learning, and other applications.

In Q3 2023, QUBT reported earnings per share of negative 11 cents. While this metric was up over 50% year-over-year, the aforementioned technological milestones are anticipated to further boost EPS and QUBT's prospects overall.


The Redmond-based tech titan has made significant efforts in the last decade to diversify its product family and become less dependent on the sales of software products. In addition to cloud, gaming and AI, Microsoft (NASDAQ:MSFT) has also made significant strides to improve its understanding of quantum computing technology.

For instance, the company highlights its work on Azure Quantum, which aims to achieve scalability toward the realization of a general-purpose quantum computer. Along these lines, Azure Quantum applications are crafted to empower quantum chemists and scientists in their research endeavors.

More precisely, the tech giant says it is actively working to achieve quantum at scale by developing a stable qubit and introducing a comprehensive, fault-tolerant quantum machine to Azure. To aid these efforts, MSFT is taking a long-term approach through its lab at the University of Sydney, which is trying to develop quantum computers at the scale needed for applications with real impact. The project is headed by Dr. David Reilly, who has already developed a cryogenic quantum control platform.

On the date of publication, Shane Neagle did not hold (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Shane Neagle is fascinated by the ways in which technology is poised to disrupt investing. He specializes in fundamental analysis and growth investing.

Go here to read the rest:
3 Stocks Leading the Quantum Computing Revolution - InvestorPlace

What is IBM doing in the race towards quantum computing? – TechHQ

Quantum computing uses electrons rather than transistors, promising much more rapid solutions to complex problems. There's every likelihood that the technology will be able to rapidly reduce current encryption schemes to dust. The quantum race is largely between China and a handful of Western companies.

We may be on the verge of revolutionary AI problem-solving with news of IBM's quantum computing advancements. (We say "may" in tribute to Werner Heisenberg and his famous principle, and because nothing since has ever been entirely certain in the quantum world.)

We are living in a golden age of artificial intelligence, with innovations seemingly bombarding us every day. The trend has continued with IBM announcing advancements in a new kind of computing that is capable of solving extraordinarily complex problems in just a few minutes.

Why is this newsworthy? Surely that's what all computers do?

Yes, but today's supercomputers would need millions of years to solve problems as complex as the ones IBM is making progress with.

Welcome to the wonderful world of quantum.

Quantum computing is a technology being developed by companies like IBM and Google. Operating in a fundamentally different way to classical computing, it relies on quantum bits (qubits) and principles including superposition and entanglement. As the name suggests, quantum physics is an intrinsic part of quantum computing. We may even need a quantum computer to explain how this type of computing works, but this technology is without question changing the world.

Everything we know is pushed to the limits with quantum computing. From science to finance and from AI to computational power, this technology offers the potential for solutions to problems that are currently intractable for classical computers.

The revolutionary nature of quantum computing lies in its potential to transform problem-solving approaches. It has the potential to tackle previously unsolvable problems and impact many fields worldwide. It presents a paradigm shift akin to the introduction of classical computing, though in comparison, quantum computing's possibilities are on a vastly different and exponentially more powerful scale.

IBM director of research Dario Gil believes quantum computing will have a significant impact on the world, but that society is not yet prepared for such changes.

"It feels to us like the pioneers of the 1940s and '50s that were building the first digital computers," he said. It's plain to see how much impact digital computers have had on the world since the 1950s, but quantum computing is another kettle of deeply unusual fish.

"We are now at a stage where we can do certain calculations with these systems that would take the biggest supercomputers in the world to do," Gil explained. But the potential of this technology is only just being realized. The goal is to continue expanding quantum computing capabilities, so that not even a million or a billion of those supercomputers connected together could do the calculations of these future machines.

A quantum computer from IBM: the future appears to be agreeably steampunk.

We have already witnessed significant progress in this field of technology, but the difference now is that Dario Gil, and others working in the quantum field, have a clear plan or strategy in place for further advancements. That means the rate of progress is only expected to accelerate, possibly at a pace that will surprise the world.

Today, computers process information on transistors, something they have done since the advent of the transistor switch in 1947. Over time, however, the speed and capabilities of computers have increased substantially thanks to continuous technological advancement, chiefly the strategy of densely integrating an ever-increasing number of transistors onto a single chip, reaching a scale of billions of transistors in today's computer chips.

Computers require billions of transistors because each transistor sits in either an on or an off state. Quantum computing is now presenting alternatives to this hallmark of classical computing, known as complementary metal-oxide-semiconductor (CMOS) technology.

Rather than using transistors, quantum computing encodes information on electrons. Thanks to the rules of quantum mechanics, these particles can exist in multiple states simultaneously, much like a coin spinning in the air, which shows aspects of both heads and tails at once. Unlike traditional computing methods, which deal with one bit of data at a time on a transistor, quantum computing uses qubits, which can store and process exponentially more information because of their ability to exist in multiple states at once.

Classical computers require a step-by-step process when finding information or solving problems. Quantum computers, on the other hand, are capable of finding solutions much faster by handling numerous possibilities concurrently.
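A rough, back-of-the-envelope illustration of that gap (my numbers, not the article's): fully describing the joint state of n qubits on a classical machine takes 2^n complex amplitudes, so the memory required doubles with every qubit added. The short Python sketch below runs that arithmetic.

```python
# Memory needed to hold the full state vector of n qubits on a classical
# machine, assuming 16 bytes per complex amplitude (complex128). Illustrative.
BYTES_PER_AMPLITUDE = 16

for n in (10, 30, 50):
    n_amplitudes = 2 ** n
    print(f"{n:>2} qubits -> 2^{n} amplitudes = {n_amplitudes * BYTES_PER_AMPLITUDE:,} bytes")

# 10 qubits: ~16 KB.  30 qubits: ~16 GiB.  50 qubits: ~16 PiB, already beyond
# the memory of any single classical machine; each extra qubit doubles it.
```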

As with any up-and-coming technology, countries around the world are vying for quantum supremacy. Currently, private free enterprise and state-directed communism are the main competitors. In other words, "the race is between China on one side, and IBM, Google, Microsoft, [and] Honeywell," according to physicist Michio Kaku. These are the big boys of quantum computing.

America has approximately 180 private firms researching quantum computing, most of which fund themselves. The US also has a number of government initiatives investing heavily in quantum research. Along with IBM, Google, and Microsoft, institutions including NASA, DARPA, and NIST are at the forefront of quantum computing and technology development.

Quantum computing bringing the sci-fi home.

China has been making substantial investments in quantum development and research for a number of years. For instance, it has several state-backed initiatives and research institutions, including the Chinese Academy of Sciences, all working on quantum technology. Large corporations, including Alibaba and Huawei, are also involved in quantum computing research.

The US government currently spends close to $1 billion a year on quantum research, whereas China has named quantum as a top national priority. New standards for encryption are to be published by the US in 2024, something that will cause waves (or potentially particles) in the quantum field.

If you're looking for revolutions in computing as big as quantum, you're probably looking back to the machine that cracked the Enigma code.

The winner of this quantum race will have striking implications, as Kaku believes the nation or company that succeeds will rule the world economy.

Think OpenAI and ChatGPT, but with the potential to crack any code, open any safe, and of course, demand any price.

As we immerse ourselves in quantum computing's promising possibilities and its billing as a savior for all of humanity's problems, we must not forget the challenges it also faces. For instance, coherence times need to be improved, and machines need to be scaled up for quantum computing to operate effectively.

Hartmut Neven, founder and manager of Google's Quantum Artificial Intelligence Lab, believes that small improvements and effective integration of existing pieces are key to building larger quantum systems. "We need little improvements here and there. If we have all the pieces together, we just need to integrate them well to build larger and larger systems."

Neven and his team aim to achieve significant progress in quantum computing over the next five or six years. He believes that quantum computing holds the key to solving problems in fields like chemistry, physics, medicine, and engineering that classical computers are currently, and will always be, incapable of handling. "You actually require a different way to represent information and process information. That's what quantum gives you," he explained.

Further challenges persist due to the delicate nature of qubits, which are prone to errors and interference from the surrounding environment. As James Tyrrell discusses here, efforts to mitigate this noise and enhance the reliability of quantum computers are underway. The expansion of the Quantum-Computing-as-a-Service (QCaaS) ecosystem is expected to shift the focus from technical intricacies to practical applications, potentially allowing users to harness the power of quantum computing for real-world problem-solving.

The development of quantum computing is accelerating at an exponential rate. Over the next decade or so, Dario Gil sees no reason why quantum computing cannot expand to thousands of qubits. He believes that systems will be built with tens of thousands and even 100,000 qubits working with each other. Where quantum technology goes from here is (thank you, Werner!) distinctly uncertain, but if the excitement is anything to go by, it may well have the answers to all the world's problems.

See the article here:
What is IBM doing in the race towards quantum computing? - TechHQ

The Future of Quantum Computing: Harvard Team Achieves Major Error Correction Milestone – SciTechDaily

Quantum computing has made a significant leap forward with Harvards new platform, capable of dynamic reconfiguration and demonstrating low error rates in two-qubit entangling gates. This breakthrough, highlighted in a recent Nature paper, signals a major advancement in overcoming the quantum error correction challenge, positioning Harvards technology alongside other leading quantum computing methods. The work, a collaboration with MIT and others, marks a crucial step towards scalable, error-corrected quantum computing. Credit: SciTechDaily.com

Quantum computing technology, with its potential for unprecedented speed and efficiency, significantly surpasses the capabilities of even the most advanced supercomputers currently available. However, this innovative technology has not been widely scaled or commercialized, primarily because of its inherent limitations in error correction. Quantum computers, unlike classical ones, cannot correct errors by copying encoded data over and over. Scientists had to find another way.

Now, a new paper in Nature illustrates a Harvard quantum computing platform's potential to solve the longstanding problem known as quantum error correction.

Leading the Harvard team is quantum optics expert Mikhail Lukin, the Joshua and Beth Friedman University Professor in physics and co-director of the Harvard Quantum Initiative. The work reported in Nature was a collaboration among Harvard, MIT, and Boston-based QuEra Computing. Also involved was the group of Markus Greiner, the George Vasmer Leverett Professor of Physics.

An effort spanning the last several years, the Harvard platform is built on an array of very cold, laser-trapped rubidium atoms. Each atom acts as a bit, or a "qubit" as it's called in the quantum world, which can perform extremely fast calculations.

The team's chief innovation is configuring their neutral-atom array so it can dynamically change its layout by moving and connecting atoms (this is called entangling in physics parlance) mid-computation. Operations that entangle pairs of atoms, called two-qubit logic gates, are units of computing power.

Running a complicated algorithm on a quantum computer requires many gates. However, these gate operations are notoriously error-prone, and a buildup of errors renders the algorithm useless.

In the new paper, the team reports near-flawless performance of its two-qubit entangling gates with extremely low error rates. For the first time, they demonstrated the ability to entangle atoms with error rates below 0.5 percent. In terms of operation quality, this puts their technologys performance on par with other leading types of quantum computing platforms, like superconducting qubits and trapped-ion qubits.
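For a sense of why sub-0.5 percent matters, here is a rough, illustrative calculation (mine, not the paper's analysis): if each gate independently succeeds with probability 1 - p, a circuit of N gates succeeds with roughly (1 - p)^N, so even small per-gate error rates compound quickly over deep circuits.

```python
# Rough estimate of how the per-gate error rate limits circuit depth,
# assuming errors are independent: success after N gates ~ (1 - p) ** N.
def circuit_success(p, n_gates):
    return (1 - p) ** n_gates

for p in (0.01, 0.005):             # 1% versus the ~0.5% regime reported here
    for n in (100, 500, 1000):
        print(f"gate error {p:.1%}, {n:>4} gates -> success ~ {circuit_success(p, n):.1%}")
```

That compounding is the buildup of errors described above, and it is why pushing physical error rates down, and then layering error correction on top, is the path to useful algorithms.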

However, Harvard's approach has major advantages over these competitors due to its large system sizes, efficient qubit control, and ability to dynamically reconfigure the layout of atoms.

"We've established that this platform has low enough physical errors that you can actually envision large-scale, error-corrected devices based on neutral atoms," said first author Simon Evered, a Harvard Griffin Graduate School of Arts and Sciences student in Lukin's group. "Our error rates are low enough now that if we were to group atoms together into logical qubits, where information is stored non-locally among the constituent atoms, these quantum error-corrected logical qubits could have even lower errors than the individual atoms."

The Harvard team's advances are reported in the same issue of Nature as other innovations led by former Harvard graduate student Jeff Thompson, now at Princeton University, and former Harvard postdoctoral fellow Manuel Endres, now at the California Institute of Technology. Taken together, these advances lay the groundwork for quantum error-corrected algorithms and large-scale quantum computing. All of this means quantum computing on neutral-atom arrays is showing the full breadth of its promise.

"These contributions open the door for very special opportunities in scalable quantum computing and a truly exciting time for this entire field ahead," Lukin said.

Reference: "High-fidelity parallel entangling gates on a neutral-atom quantum computer" by Simon J. Evered, Dolev Bluvstein, Marcin Kalinowski, Sepehr Ebadi, Tom Manovitz, Hengyun Zhou, Sophie H. Li, Alexandra A. Geim, Tout T. Wang, Nishad Maskara, Harry Levine, Giulia Semeghini, Markus Greiner, Vladan Vuletić and Mikhail D. Lukin, 11 October 2023, Nature. DOI: 10.1038/s41586-023-06481-y

The research was supported by the U.S. Department of Energy's Quantum Systems Accelerator Center; the Center for Ultracold Atoms; the National Science Foundation; the Army Research Office Multidisciplinary University Research Initiative; and the DARPA Optimization with Noisy Intermediate-Scale Quantum Devices program.

Read the rest here:
The Future of Quantum Computing: Harvard Team Achieves Major Error Correction Milestone - SciTechDaily

Harvard, QuEra, MIT, and the NIST/University of Maryland Usher in New Era of Quantum Computing by Performing … – AZoQuantum

QuEra Computing, the leader in neutral-atom quantum computers, today announced a significant breakthrough published in the scientific journal Nature. In experiments led by Harvard University in close collaboration with QuEra Computing, MIT, and NIST/UMD, researchers successfully executed large-scale algorithms on an error-corrected quantum computer with 48 logical qubits and hundreds of entangling logical operations. This advancement, a significant leap in quantum computing, sets the stage for developing truly scalable and fault-tolerant quantum computers that could solve practical classically intractable problems.

"We at Moody's Analytics recognize the monumental significance of achieving 48 logical qubits in a fault-tolerant quantum computing environment and its potential to revolutionize data analytics and financial simulations," said Sergio Gago, Managing Director of Quantum and AI at Moody's Analytics. "This brings us closer to a future where quantum computing is not just an experimental endeavor but a practical tool that can deliver real-world solutions for our clients. This pivotal moment could redefine how industries approach complex computational challenges."

A critical challenge preventing quantum computing from reaching its enormous potential is the noise that affects qubits, corrupting computations before reaching the desired results. Quantum error correction overcomes these limitations by creating "logical qubits," groups of physical qubits that are entangled to store information redundantly. This redundancy allows for identifying and correcting errors that may occur during quantum computations. By using logical qubits instead of individual physical qubits, quantum systems can achieve a level of fault tolerance, making them more robust and reliable for complex computations.
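The simplest way to see how redundancy enables error correction is the classical repetition code: store one bit as several copies and decode by majority vote. The Python sketch below is illustrative only (real quantum codes are far subtler, since quantum states cannot simply be copied, and the specific codes used in this work are not shown here), but it demonstrates how an encoded bit survives noise that would often corrupt a single unprotected bit.

```python
import random

# Toy classical repetition code: one logical bit is stored redundantly in
# three physical bits and decoded by majority vote. Illustrative only.
def encode(bit, copies=3):
    return [bit] * copies

def noisy_channel(bits, flip_prob):
    # Each bit flips independently with probability flip_prob.
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits):
    return int(sum(bits) > len(bits) / 2)  # majority vote

random.seed(0)
p = 0.05            # per-bit ("physical") error rate
trials = 100_000
unprotected = sum(decode(noisy_channel([0], p)) for _ in range(trials))
protected = sum(decode(noisy_channel(encode(0), p)) for _ in range(trials))
print("unprotected error rate:", unprotected / trials)  # close to 0.05
print("encoded error rate:    ", protected / trials)    # close to 0.007 (needs 2+ flips)
```

In a code of distance d, roughly (d - 1)/2 errors can be corrected, so the three-copy code above tolerates one flip, while the distance-7 logical qubits reported below tolerate correspondingly more.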

This is a truly exciting time in our field as the fundamental ideas of quantum error correction and fault tolerance are starting to bear fruit, said Mikhail Lukin, the Joshua and Beth Friedman University Professor, co-director of the Harvard Quantum Initiative, and co-founder of QuEra Computing. This work, leveraging the outstanding recent progress in the neutral-atom quantum computing community, is a testament to the incredible effort of exceptionally talented students and postdocs as well as our remarkable collaborators at QuEra, MIT, and NIST/UMD.Although we are clear-eyed about the challenges ahead, we expect that this new advance will greatly accelerate the progress towards large-scale, useful quantum computers, enabling thenext phase of discovery and innovation.

Previous demonstrations of error correction have showcased one, two, or three logical qubits. This new work demonstrates quantum error correction with 48 logical qubits, enhancing computational stability and reliability while addressing the error problem. On the path to large-scale quantum computation, Harvard, QuEra, and their collaborators reported the following critical achievements:

Creation and entanglement of the largest logical qubits to date, demonstrating a code distance of 7, enabling the detection and correction of arbitrary errors occurring during the entangling logical gate operations. Larger code distances imply higher resistance to quantum errors. Furthermore, the research showed for the first time that increasing the code distance indeed reduces the error rate in logical operations.

The breakthrough utilized an advanced neutral-atom system quantum computer, combining hundreds of qubits, high two-qubit gate fidelities, arbitrary connectivity, fully programmable single-qubit rotations, and mid-circuit readout.

The system also included hardware-efficient control in reconfigurable neutral-atom arrays, employing direct, parallel control over an entire group of logical qubits. This parallel control dramatically reduces the control overhead and complexity of performing logical operations. While using as many as 280 physical qubits, researchers needed to program fewer than ten control signals to execute all of the required operations in the study. Other quantum modalities typically require hundreds of control signals for the same number of qubits. As quantum computers scale to many thousands of qubits, efficient control becomes critically important.

"The achievement of 48 logical qubits with high fault tolerance is a watershed moment in the quantum computing industry," said Matt Langione, Partner at the Boston Consulting Group. "This breakthrough not only accelerates the timeline for practical quantum applications but also opens up new avenues for solving problems that were previously considered intractable by classical computing methods. It's a game-changer that significantly elevates the commercial viability of quantum computing. Businesses across sectors should take note, as the race to quantum advantage just got a major boost."

"Today marks a historic milestone for QuEra and the broader quantum computing community," said Alex Keesling, CEO of QuEra Computing. "These achievements are the culmination of a multi-year effort, led by our Harvard and MIT academic collaborators together with QuEra scientists and engineers, to push the boundaries of what's possible in quantum computing. This isn't just a technological leap; it's a testament to the power of collaboration and investment in pioneering research. We're thrilled to set the stage for a new era of scalable, fault-tolerant quantum computing that can tackle some of the world's most complex problems. The future of quantum is here, and QuEra is proud to be at the forefront of this revolution."

"Our experience in manufacturing and operating quantum computers, such as our first-generation machine available on a public cloud since 2022, coupled with this groundbreaking research, puts us in a prime position to lead the quantum revolution," added Keesling.

The work was supported by the Defense Advanced Research Projects Agency through the Optimization with Noisy Intermediate-Scale Quantum devices (ONISQ) program, the National Science Foundation, the Center for Ultracold Atoms (an NSF Physics Frontiers Center), and the Army Research Office.

QuEra also announced a special event on Jan 9th at 11:30 AM ET, where QuEra will reveal its commercial roadmap for fault-tolerant quantum computers. Register for this online event at https://quera.link/roadmap

Source: https://www.quera.com/

Read the original post:
Harvard, QuEra, MIT, and the NIST/University of Maryland Usher in New Era of Quantum Computing by Performing ... - AZoQuantum