Archive for the ‘Quantum Computer’ Category

Quantum computing: The five biggest breakthroughs – Engineers Ireland

Quantum computing is a revolutionary technology already making waves in many industries, such as drug discovery, cryptography, finance, and logistics. It works by exploiting quantum mechanical phenomena to perform complex computations in a fraction of the time classical computers require. Two main quantum mechanical phenomena drive quantum computers' speed and computational prowess: superposition and entanglement.

Unlike classical computers, which operate on binary bits (0 and 1), quantum computers operate on quantum bits, or qubits. Qubits can exist in a state of superposition. This means that a qubit can exist in a combination of the 0 and 1 states simultaneously, exponentially increasing the amount of information a register of qubits can represent.
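To make this concrete, here is a minimal state-vector sketch in plain Python/NumPy (not tied to any particular quantum SDK) showing a qubit placed in an equal superposition by a Hadamard gate:

```python
import numpy as np

# A qubit state is a 2-component complex vector: amplitudes for |0> and |1>.
zero = np.array([1, 0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ zero

# Measurement probabilities are the squared magnitudes of the amplitudes.
print(psi.round(3))                  # [0.707+0.j 0.707+0.j]
print((np.abs(psi) ** 2).round(3))   # [0.5 0.5] -- equal chance of reading 0 or 1
```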

Another unique property of qubits is their ability to become entangled. This means that two qubits can be correlated so that measuring the state of one immediately tells us something about its companion, no matter how far apart they are. This correlation can be harnessed for processing vast amounts of data and solving complex problems that classical computers cannot.
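Continuing the same NumPy sketch, a Hadamard followed by a CNOT produces the classic entangled Bell state, in which the two qubits always give matching measurement results:

```python
import numpy as np

zero = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Hadamard on the first qubit, then CNOT, gives (|00> + |11>) / sqrt(2).
state = CNOT @ np.kron(H @ zero, zero)

# Outcomes 01 and 10 have zero probability: reading one qubit fixes the other.
probs = np.abs(state) ** 2
print(dict(zip(["00", "01", "10", "11"], probs.round(3).tolist())))
# {'00': 0.5, '01': 0.0, '10': 0.0, '11': 0.5}
```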

Classical computers only have the power to simulate phenomena based on classical physics, making it more difficult or slower to solve problems that rely on quantum phenomena. This is where the true importance of quantum computers lies.

Since quantum computers are based on qubits, they can solve problems that are intractable for classical computers and revolutionise many industries. For example, quantum computers can rapidly simulate molecules and chemical reactions, helping to discover new drugs and materials with exceptional properties.

Although significant breakthroughs have been made in quantum computing, we are still in the nascent stages of its development.

The objective of quantum supremacy is to demonstrate that a quantum computer can solve a problem that no classical computer can solve in any reasonable length of time, regardless of whether the problem itself is useful. Achieving this goal demonstrates the power of a quantum computer over a classical computer in complex problem-solving.

In October 2019, Google confirmed that it had achieved quantum supremacy using its fully programmable 54-qubit processor, Sycamore. It solved a sampling problem in 200 seconds that would have taken a supercomputer nearly 10,000 years. This marked a significant achievement in the development of quantum computing.

Richard Feynman first theorised the idea of using quantum mechanics to perform calculations impossible for classical computers. Image: Unknown/Wikimedia Commons

Since then, many researchers have demonstrated quantum supremacy by solving various sampling problems. The impact of achieving quantum supremacy cannot be overstated. It validates the potential of quantum computing to solve problems beyond the capabilities of classical computers, as first theorised by Richard Feynman in the 1980s.

Apart from sampling problems, other applications have been proposed for demonstrating quantum supremacy, such as Shor's algorithm for factoring integers, which is extremely important in encryption. However, implementing Shor's algorithm for large numbers is not feasible with existing technology, so it is not preferred over sampling problems for demonstrating supremacy.

The most pressing concern with quantum computers is their sensitivity to errors induced by environmental noise and imperfect control. This hinders their practical usability, as data stored on a quantum computer can become corrupted.

Classical error correction relies on redundancy, i.e., repetition. However, quantum information cannot be cloned or copied due to the no-cloning theorem (which states that it is impossible to create an independent and identical copy of an arbitrary unknown quantum state). Therefore, a new error correction method is required for quantum computing systems.

QEC for a single qubit. Image: Self/Wikimedia Commons

Quantum error correction (QEC) is a way to mitigate these errors and ensure that the data stored on a quantum computer is error-free, thus improving the reliability and accuracy of quantum computers.

The principle of QEC is to encode the data stored on a quantum computer such that the errors can be detected and corrected without disrupting the computation being performed on it.

This is done using quantum error-correction codes (QECCs). QECCs work by encoding the information onto a larger state space. They then correct errors without directly measuring the encoded information, thereby preventing the collapse of the quantum state.
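As an illustrative sketch only (QECCs used in practice, such as surface codes, are far more elaborate), the three-qubit bit-flip repetition code below shows the key idea: parity-check measurements locate an error without ever revealing the encoded amplitudes.

```python
import numpy as np
from functools import reduce

def kron(*ops):
    """Kronecker product of a sequence of vectors or matrices."""
    return reduce(np.kron, ops)

zero, one = np.array([1, 0], complex), np.array([0, 1], complex)
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Encode a|0> + b|1>  ->  a|000> + b|111>  (a larger, redundant state space).
a, b = 0.6, 0.8
logical = a * kron(zero, zero, zero) + b * kron(one, one, one)

# A bit-flip error hits the middle qubit.
state = kron(I2, X, I2) @ logical

# Parity checks Z0Z1 and Z1Z2 take definite values (+1 or -1) for any single
# bit flip, so the syndrome locates the error without measuring a or b.
s1 = int(round(np.real(state.conj() @ kron(Z, Z, I2) @ state)))
s2 = int(round(np.real(state.conj() @ kron(I2, Z, Z) @ state)))
flipped = {(1, 1): None, (-1, 1): 0, (-1, -1): 1, (1, -1): 2}[(s1, s2)]

# Apply the correcting X gate on the identified qubit.
if flipped is not None:
    ops = [I2, I2, I2]
    ops[flipped] = X
    state = kron(*ops) @ state

print("error located on qubit:", flipped)           # 1
print("recovered:", np.allclose(state, logical))    # True
```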

The first experimental demonstration of QEC was performed in 1998 with nuclear magnetic resonance qubits. Since then, several experiments demonstrating QEC have been performed using, for example, linear optics and trapped ions, among other platforms.

A significant breakthrough came in 2016 when researchers extended the lifespan of a quantum bit using QEC. Their research showed the advantage of using hardware-efficient qubit encoding over traditional QEC methods for improving the lifetime of a qubit.

The detection and elimination of errors is critical to developing practical quantum computers. QEC handles errors in the stored quantum information, but what about errors that arise while operations are being performed? Is there a way to correct those errors and ensure that the computations are not useless?

Fault-tolerant quantum computing is a method to ensure that these errors are detected and corrected using a combination of QECCs and fault-tolerant gates. This ensures that errors arising during the computations don't accumulate and render them worthless.

Quantum computing features. Image: Akash Sain/iStock

The biggest challenge in achieving fault-tolerant quantum computing is the need for many qubits. QECCs themselves require a lot of qubits to detect and correct errors.

Additionally, fault-tolerant gates also require a large number of qubits. However, two independent theoretical studies published in 1998 and 2008 proved that fault-tolerant quantum computers can be built. This has come to be known as the threshold theorem, which states that if the physical error rates of a quantum computer are below a certain threshold, the logical error rate can be suppressed to arbitrarily low values.
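A rough sense of what the threshold theorem promises can be had from the rule-of-thumb scaling often quoted for surface codes (an assumed illustrative model, not necessarily the codes analysed in the studies cited above), in which the logical error rate falls roughly as A(p/p_th)^((d+1)/2) once the physical error rate p drops below the threshold p_th:

```python
# Illustrative rule-of-thumb scaling for a distance-d surface code; the
# constants A and p_th are assumed values chosen for demonstration only.
def logical_error_rate(p, d, p_th=1e-2, A=0.1):
    return A * (p / p_th) ** ((d + 1) / 2)

# Below threshold (p = 0.1%), growing the code suppresses errors rapidly...
for d in (3, 5, 7, 9):
    print(f"d={d}: p_L ~ {logical_error_rate(1e-3, d):.1e}")

# ...whereas above threshold (p = 3%), adding qubits only makes things worse
# (values at or above 1 simply mean the approximation has broken down).
for d in (3, 5, 7, 9):
    print(f"d={d}: p_L ~ {logical_error_rate(3e-2, d):.1e}")
```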

No experiment has yet demonstrated fully fault-tolerant quantum computing, owing to the large number of qubits required. The closest we've come to an experimental realisation is a 2022 study published in Nature demonstrating fault-tolerant universal quantum gate operations.

We have seen teleportation one too many times in science fiction movies and TV shows. But are any researchers close to making it a reality? Well, yes and no. Quantum teleportation allows a quantum state to be transferred from one physical location to another without physically moving the particle that carries it. It has a wide range of applications, from secure quantum communication to distributed quantum computing.

Quantum teleportation was first investigated in 1993 by scientists who proposed it as a way to send and receive quantum information. It was experimentally realised only four years later, in 1997, by two independent research groups. The basic principle behind quantum teleportation is entanglement (when two particles remain connected even when separated by vast distances).
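The standard protocol can be simulated end to end in a few dozen lines. The sketch below is plain NumPy; the cnot and measure helpers, the random seed, and the example amplitudes are made up for this illustration. It sends the amplitudes of qubit 0 to qubit 2 using one pre-shared entangled pair and two classical bits:

```python
import numpy as np
from functools import reduce

def kron(*ops):
    return reduce(np.kron, ops)

rng = np.random.default_rng(7)
zero = np.array([1, 0], dtype=complex)
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def cnot(control, target, n=3):
    """CNOT on an n-qubit register, built as a permutation matrix."""
    U = np.zeros((2 ** n, 2 ** n), dtype=complex)
    for i in range(2 ** n):
        bits = [(i >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        U[sum(b << (n - 1 - q) for q, b in enumerate(bits)), i] = 1
    return U

def measure(state, qubit, n=3):
    """Projectively measure one qubit; return (outcome, collapsed state)."""
    mask = np.array([(i >> (n - 1 - qubit)) & 1 for i in range(2 ** n)])
    p1 = np.sum(np.abs(state[mask == 1]) ** 2)
    outcome = int(rng.random() < p1)
    collapsed = np.where(mask == outcome, state, 0)
    return outcome, collapsed / np.linalg.norm(collapsed)

# Qubit 0 holds the unknown state; qubits 1 and 2 are prepared as a Bell pair.
psi = 0.6 * zero + 0.8j * np.array([0, 1], complex)
state = kron(psi, zero, zero)
state = cnot(1, 2) @ (kron(I2, H, I2) @ state)

# Sender: entangle qubit 0 with the pair, then measure qubits 0 and 1.
state = kron(H, I2, I2) @ (cnot(0, 1) @ state)
m0, state = measure(state, 0)
m1, state = measure(state, 1)

# Receiver: apply corrections to qubit 2 based on the two classical bits.
if m1:
    state = kron(I2, I2, X) @ state
if m0:
    state = kron(I2, I2, Z) @ state

# Qubit 2 now carries the original amplitudes, without ever travelling itself.
recovered = state.reshape(2, 2, 2)[m0, m1, :]
print(np.allclose(recovered, psi))  # True
```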

Since 1997, many research groups have demonstrated the quantum teleportation of photons, atoms, and other quantum particles. It is the only real form of teleportation that exists.

In fact, the 2022 Nobel Prize in Physics was awarded to three scientists, Alain Aspect, John Clauser, and Anton Zeilinger, for experiments with entangled photons. Their work demonstrated quantum entanglement and showed that it could be used to teleport quantum information from one photon to another.

Quantum teleportation is the cornerstone for building a quantum internet. This is because it enables the distribution of entanglement over long distances.

Another important application of quantum teleportation is enabling remote quantum operations, meaning that a quantum computation can be performed on a distant processor without transmitting the qubits. This could be useful for secure communication and for performing quantum computations in inaccessible or hostile environments.

Topology is a branch of mathematics concerned with studying the properties of shapes and spaces preserved when deformed. But what does it have to do with quantum computing?

In essence, topological quantum computing is a theoretical model that uses quasiparticles called anyons, which exist only in two-dimensional systems, to encode and manipulate qubits.

The method is founded on the topological properties of matter: the world lines of these anyons (the paths they trace in four-dimensional spacetime) form braids. These braids then make up the logic gates that are the building blocks of computers.

No experimental studies demonstrate topological quantum computing. Image: FMNLab/Wikimedia Commons

Topological qubits are protected against local perturbations and can be manipulated with high precision, making them less susceptible to decoherence. Additionally, topological quantum computing is more resistant to errors due to its inherent redundancy and topological protection, making it a promising candidate for fault-tolerant quantum computing.

Most topological quantum computing research is theoretical; currently, no studies provide substantial experimental support for it. But developments in this area of research are vital for building practical and scalable quantum computers.

With a mix of theoretical and experimental demonstrations, quantum computing is still in the early stages of research and development. These developments can potentially revolutionise several industries and academic disciplines, including financial services, materials science, cryptography, and artificial intelligence.

Even though more study is needed, the outlook for quantum computing's future is promising. We may anticipate further developments and innovations in the years to come.

Continued here:
Quantum computing: The five biggest breakthroughs - Engineers Ireland

Accelerating the Accelerator: Scientist Speeds CERN’s HPC With … – Nvidia

Editor's note: This is part of a series profiling researchers advancing science with high performance computing.

Maria Girone is expanding the world's largest network of scientific computers with accelerated computing and AI.

Since 2002, the Ph.D. in particle physics has worked on a grid of systems across 170 sites in more than 40 countries that support CERN's Large Hadron Collider (LHC), itself poised for a major upgrade.

A high-luminosity version of the giant accelerator (HL-LHC) will produce 10x more proton collisions, spawning exabytes of data a year. That's an order of magnitude more than it generated in 2012, when two of its experiments uncovered the Higgs boson, a subatomic particle that validated scientists' understanding of the universe.

Girone loved science from her earliest days growing up in Southern Italy.

"In college, I wanted to learn about the fundamental forces that govern the universe, so I focused on physics," she said. "I was drawn to CERN because it's where people from different parts of the world work together with a common passion for science."

Tucked between Lake Geneva and the Jura mountains, the European Organization for Nuclear Research is a nexus for more than 12,000 physicists.

Its 27-kilometer ring is sometimes called the world's fastest racetrack because protons careen around it at 99.9999991% the speed of light. Its superconducting magnets operate near absolute zero, creating collisions that are briefly millions of times hotter than the sun.

In 2016, Girone was named CTO of CERN openlab, a group that gathers academic and industry researchers to accelerate innovation and tackle future computing challenges. It works closely with NVIDIA through its collaboration with E4 Computer Engineering, a specialist in HPC and AI based in Italy.

In one of her initial acts, Girone organized CERN openlab's first workshop on AI.

Industry participation was strong and enthusiastic about the technology. In their presentations, physicists explained the challenges ahead.

"By the end of the day we realized we were from two different worlds, but people were listening to each other, and enthusiastically coming up with proposals for what to do next," she said.

Today, the number of publications on applying AI across the whole data processing chain in high-energy physics is rising, Girone reports. The work attracts young researchers who see opportunities to solve complex problems with AI, she said.

Meanwhile, researchers are also porting physics software to GPU accelerators and using existing AI programs that run on GPUs.

"This wouldn't have happened so quickly without the support of NVIDIA working with our researchers to solve problems, answer questions and write articles," she said. "It's been extremely important to have people at NVIDIA who appreciate how science needs to evolve in tandem with technology, and how we can make use of acceleration with GPUs."

Energy efficiency is another priority for Girone's team.

"We're working on a number of projects, like porting to lower-power architectures, and we look forward to evaluating the next generation of lower-power processors," she said.

To prepare for the HL-LHC, Girone, named head of CERN openlab in March, seeks new ways to accelerate science with machine learning and accelerated computing. Other tools are on the near and far horizons, too.

The group recently won funding to prototype an engine for building digital twins. It will provide services for physicists, as well as researchers in fields from astronomy to environmental science.

CERN also launched a collaboration among academic and industry researchers in quantum computing. The technology could advance science and lead to better quantum systems, too.

In another act of community-making, Girone was among four co-founders of a Swiss chapter of the Women in HPC group. It will help define specific actions to support women in every phase of their careers.

"I'm passionate about creating diverse teams where everyone feels they contribute and belong. It's not just a checkbox about numbers; you want to realize a feeling of belonging," she said.

Girone was among thousands of physicists who captured some of that spirit the day CERN announced the Higgs boson discovery.

She recalls getting up at 4 a.m. to queue for a seat in the main auditorium. It couldn't hold all the researchers and guests who arrived that day, but the joy of accomplishment followed her and others watching the event from a nearby hall.

"I knew the contribution I made," she said. "I was proud being among the many authors of the paper, and my parents and my kids felt proud, too."

Check out other profiles in this series:

More:
Accelerating the Accelerator: Scientist Speeds CERN's HPC With ... - Nvidia

Daily briefing: Quantum computers are all ‘terrible’ but researchers aren’t worried – Nature.com

Hello Nature readers, would you like to get this Briefing in your inbox free every day? Sign up here.

LIGO can detect gravitational waves that are generated when two black holes collide. Credit: The SXS Project

The Laser Interferometer Gravitational-Wave Observatory (LIGO) is back after a three-year hiatus and a multimillion-dollar upgrade. The first detection of gravitational waves, ripples in spacetime from colliding black holes and other cosmic cataclysms, was made at LIGO in 2015. Improvements to the detectors' sensitivity mean that LIGO could pick up signals of colliding black holes every few days, compared with once a week during its previous run. Scientists hope to detect the gravitational signal of a collapsing star before it manifests as a supernova explosion, as well as the continuous gravitational waves produced by a pulsar.

Nature | 6 min read

A wireless connection between the brain and the spinal cord allows a paralysed man to walk using his thoughts. Gert-Jan Oskam, whose legs were paralysed after a cycling accident, received a spinal implant in 2018 that generated robotic movement through pre-programmed electrical stimulation. He has now received head implants that detect brain activity and transmit the signal to a backpack computer, which decodes the information and activates the spinal pulse generator. This brain-spine interface gives Oskam full control over the stimulation, so he can walk and climb stairs. "The stimulation before was controlling me and now I am controlling stimulation by my thought," he says.

Nature | 4 min read

Reference: Nature paper

A component in their mothers' milk triggers a diet switch in baby mice's heart cells. Mouse embryos' heart-muscle cells burn sugar and lactic acid, but within 24 hours of birth, they shift to fatty acids as their fuel. After seven years of experiments, some of which involved milking mice by hand, researchers have now zeroed in on γ-linolenic acid as a key compound that drives the switch, and identified the receptor and genes involved. Human breast milk also contains γ-linolenic acid, and a precursor is found in baby formula, although it's unclear whether it has the same role in humans.

Nature | 4 min read

Go deeper with an analysis by heart development experts in the Nature News & Views article (6 min read, Nature paywall)

Reference: Nature paper

China's new data restrictions have strengthened privacy but are concerning researchers globally. "The signal has been very clear that China does not want its scientists to collaborate as freely as they used to with foreigners," says sociologist Joy Zhang. China's largest academic database has partially suspended foreign access, and institutions that send, for example, clinical-trial data abroad must now undergo a security assessment. Unlike the European Union's data protection regulation, the law has no exemption for scientists. The Chinese government has also proposed adding CRISPR gene editing, crop breeding and photovoltaics techniques to its list of technologies whose export is prohibited or restricted.

Nature | 7 min read

Japan's government is drawing fresh ire from researchers over plans to privatize the country's influential science council (SCJ). The government has already backed away from plans to reform the council's constitution and its process for appointing members. Observers predict that the council will ultimately be forced to forge a new relationship with the government: "I think the SCJ will have to find a way of existing as an organ within the government, while being independent," says policy researcher Hiroshi Nagano.

Nature | 5 min read

Even the scientists who have made quantum computers their life's work say they can't do anything useful yet. "They're all terrible," says physicist Winfried Hensinger of the five he owns (he's working on a new large-scale, modular type). But enthusiasts aren't concerned, and researchers say development is proceeding better than expected. The devices have the potential to accelerate drug discovery, crack encryption, speed up decision-making in financial transactions, improve machine learning, develop revolutionary materials and even address climate change, and that barely scratches the surface, researchers say. "The short-term hype is a bit high," says computational mathematician Steve Brierley, a founder of a quantum-computing firm. "But the long-term hype is nowhere near enough."

Nature | 10 min read

Across Africa, 43% of people still do not have electricity, and one of the causes is that highly indebted countries can't invest in research. Many countries find it impossible to pay off debts and protect public spending, which excludes them from expanding their scientific capabilities. Creditors should consider a debt-for-science swap, argues a Nature editorial: agree to waive some debt for countries that spend more on research.

Nature | 5 min read

The research system still tends to put power in the hands of just a handful of people. Universities should look to industry to learn how to better reflect how research is done today, argues a Nature editorial. (5 min read)

More:
Daily briefing: Quantum computers are all 'terrible' but researchers aren't worried - Nature.com

From self-driving cars to military surveillance: quantum computing can help secure the future of AI systems – The Conversation

Artificial intelligence algorithms are quickly becoming a part of everyday life. Many systems that require strong security are either already underpinned by machine learning or soon will be. These systems include facial recognition, banking, military targeting applications, and robots and autonomous vehicles, to name a few.

This raises an important question: how secure are these machine learning algorithms against malicious attacks?

In an article published today in Nature Machine Intelligence, my colleagues at the University of Melbourne and I discuss a potential solution to the vulnerability of machine learning models.

We propose that the integration of quantum computing in these models could yield new algorithms with strong resilience against adversarial attacks.

Machine learning algorithms can be remarkably accurate and efficient for many tasks. They are particularly useful for classifying and identifying image features. However, they're also highly vulnerable to data manipulation attacks, which can pose serious security risks.

Data manipulation attacks, which involve the very subtle manipulation of image data, can be launched in several ways. An attack may be launched by mixing corrupt data into a training dataset used to train an algorithm, leading it to learn things it shouldn't.

Manipulated data can also be injected during the testing phase (after training is complete), in cases where the AI system continues to train the underlying algorithms while in use.

People can even carry out such attacks from the physical world. Someone could put a sticker on a stop sign to fool a self-driving car's AI into identifying it as a speed-limit sign. Or, on the front lines, troops might wear uniforms that can fool AI-based drones into identifying them as landscape features.
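To illustrate how little manipulation is needed, here is a toy sketch, loosely in the spirit of the fast gradient sign method rather than any attack described in the article; the weights and input are made-up placeholders, and the classifier is just a linear model. A small, uniform nudge to every pixel flips the prediction:

```python
import numpy as np

# A toy linear "image" classifier: label = 1 if w.x + b > 0.
# Weights and input are random placeholders, purely for illustration.
rng = np.random.default_rng(0)
w = rng.normal(size=64)              # one weight per "pixel" of an 8x8 image
b = 0.0
x = rng.normal(size=64)              # a clean input

clean_score = w @ x + b
clean_label = int(clean_score > 0)

# FGSM-style perturbation: nudge every pixel in the direction that most
# changes the score, using the smallest step that just flips the label.
epsilon = 1.01 * abs(clean_score) / np.abs(w).sum()
x_adv = x - np.sign(clean_score) * epsilon * np.sign(w)

print("clean label:         ", clean_label)
print("adversarial label:   ", int(w @ x_adv + b > 0))   # flipped
print("max per-pixel change:", round(float(epsilon), 4))  # a small nudge
```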

Read more: AI to Z: all the terms you need to know to keep up in the AI hype age

Either way, the consequences of data manipulation attacks can be severe. For example, if a self-driving car uses a machine learning algorithm that has been compromised, it may incorrectly predict there are no humans on the road when there are.

In our article, we describe how integrating quantum computing with machine learning could give rise to secure algorithms called quantum machine learning models.

These algorithms are carefully designed to exploit special quantum properties that would allow them to find specific patterns in image data that aren't easily manipulated. The result would be resilient algorithms that are safe against even powerful attacks. They also wouldn't require the expensive adversarial training currently used to teach algorithms how to resist such attacks.

Beyond this, quantum machine learning could allow for faster algorithmic training and more accuracy in learning features.

Today's classical computers work by storing and processing information as bits, or binary digits, the smallest unit of data a computer can process. In classical computers, which follow the laws of classical physics, bits are represented as binary numbers, specifically 0s and 1s.

Quantum computing, on the other hand, follows principles used in quantum physics. Information in quantum computers is stored and processed as qubits (quantum bits) which can exist as 0, 1, or a combination of both at once. A quantum system that exists in multiple states at once is said to be in a superposition state. Quantum computers can be used to design clever algorithms that exploit this property.
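One way to see why this matters is to count the numbers needed to describe a quantum register classically: an n-qubit superposition is specified by 2^n complex amplitudes, so simulating it on a conventional machine quickly becomes infeasible. A small back-of-the-envelope sketch:

```python
import numpy as np

# An n-qubit state is described by 2**n complex amplitudes; the memory needed
# just to write the state down doubles with every extra qubit.
for n in (10, 20, 30, 40):
    amplitudes = 2 ** n
    gib = amplitudes * np.complex128().nbytes / 2 ** 30
    print(f"{n} qubits -> {amplitudes:,} amplitudes (~{gib:,.3f} GiB as complex128)")
```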

However, while there are significant potential benefits in using quantum computing to secure machine learning models, it could also be a double-edged sword.

On one hand, quantum machine learning models will provide critical security for many sensitive applications. On the other, quantum computers could be used to generate powerful adversarial attacks, capable of easily deceiving even state-of-the-art conventional machine learning models.

Moving forward, we'll need to seriously consider the best ways to protect our systems; an adversary with access to early quantum computers would pose a significant security threat.

The current evidence suggests we're still some years away from quantum machine learning becoming a reality, due to limitations in the current generation of quantum processors.

Today's quantum computers are relatively small (with fewer than 500 qubits) and their error rates are high. Errors may arise for several reasons, including imperfect fabrication of qubits, errors in the control circuitry, or loss of information (called quantum decoherence) through interaction with the environment.

Still, we've seen enormous progress in quantum hardware and software over the past few years. According to recent quantum hardware roadmaps, it's anticipated that quantum devices made in coming years will have hundreds to thousands of qubits.

These devices should be able to run powerful quantum machine learning models to help protect a large range of industries that rely on machine learning and AI tools.

Worldwide, governments and private sectors alike are increasing their investment in quantum technologies.

This month the Australian government launched the National Quantum Strategy, aimed at growing the nation's quantum industry and commercialising quantum technologies. According to the CSIRO, Australia's quantum industry could be worth about A$2.2 billion by 2030.

Read more: Australia has a National Quantum Strategy. What does that mean?

Read more:
From self-driving cars to military surveillance: quantum computing can help secure the future of AI systems - The Conversation

3 Quantum Computing Stocks to Make You the Millionaire Next Door – InvestorPlace

With advanced digitalization technologies possibly standing poised to catalyze a seismic paradigm shift, investors should consider quantum computing stocks for millionaires (or at least those who aspire to be millionaires). As the AWS website under Amazon (NASDAQ:AMZN) states, quantum computing is a multidisciplinary field comprising aspects of computer science, physics, and mathematics that utilizes quantum mechanics to solve complex problems faster than on classical computers. Stated differently, the computer as we know it is about to get a makeover, bolstering relevance for top quantum computing stocks.

According to Precedence Research, the global quantum computing market reached a valuation of $10.13 billion in 2022. However, experts project that by 2030, the sector could command a valuation of $125 billion. For speculators, it's well worth considering millionaire quantum computing stock picks. To be sure, because of the novel market, investors should be prepared for wild unpredictability. Nevertheless, investing in quantum computing stocks could lead to life-changing returns.
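For context, those two figures imply roughly a 37% compound annual growth rate; a quick check using only the numbers quoted above:

```python
# Implied compound annual growth rate from the market figures cited above:
# $10.13 billion in 2022 growing to a projected $125 billion by 2030.
start, end, years = 10.13, 125.0, 2030 - 2022
cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")   # about 37% per year
```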

Source: shutterstock.com/LCV

Initially, the inclusion of IBM (NYSE:IBM) as one of the quantum computing stocks for millionaires might not make sense. After all, this segment is supposed to be about high-risk, high-reward speculation. However, if you're interested in taking the more surefire, steady approach to being a millionaire, Big Blue makes plenty of sense.

Fundamentally, IBM belongs in the discussion for top quantum computing stocks thanks to its myriad innovations in the sector. For example, the company produced Qiskit Runtime, which is IBM's quantum computing service and programming model.

Financially, IBM carries a decent profile: nothing too positively remarkable but nothing too horrible either. Notably, IBM trades at a forward multiple of 13.44. As a discount to projected earnings, Big Blue ranks better than 79.51% of the competition. Also, its dividend yield comes out to 5.46%, which is quite generous.

Finally, Wall Street analysts peg IBM as a moderate buy. Their average price target lands at $147.38, implying nearly 16% upside potential.

Source: Boykov / Shutterstock.com

Based in Berkeley, California, Rigetti Computing (NASDAQ:RGTI) develops quantum integrated circuits used for quantum computers. Per its website, Rigetti specializes in fusing artificial intelligence and machine learning, thereby allowing the company to address the world's most important and pressing problems. A wildly risky investment, RGTI has gained only a bit more than half a percent so far this year.

Most of that stems from shares popping up nearly 45% on the May 22 session, rising in sympathy with quantum computing rival D-Wave Quantum (NYSE:QBTS). If you follow the sector closely, you'll know that QBTS skyrocketed almost 111% on Monday. However, plenty of traders don't want to overexpose themselves to extreme strength, making RGTI a potentially intriguing alternative for quantum computing stocks for millionaires.

To be sure, it's financially a high-risk proposition. According to Zacks Equity Research, Rigetti came out with a quarterly loss of 19 cents for its first-quarter earnings report, missing the consensus estimate of a loss of 16 cents. On the positive side, Rigetti carries a relatively strong cash-to-debt ratio of 3.27. Still, despite some flaws, analysts peg RGTI as a consensus moderate buy. Their average price target clocks in at $1.25, implying over 70% upside potential.

Source: Bartlomiej K. Wroblewski / Shutterstock.com

For those who want to take their quantum computing stocks for millionaires to the extreme, Quantum Computing (NASDAQ:QUBT) may be what you're looking for. A full-stack quantum software and hardware company, Quantum seeks to accelerate the value of quantum computing for real-world business solutions, per its website. To help bring about this goal, the company bought out QPhoton, which specializes in quantum photonic systems (QPS).

Although fundamentally exciting, prospective investors must recognize that QUBT represents a high-risk, high-reward venture. In the trailing one-year period, for example, QUBT stumbled by more than 17%. Over the past five years, shares hemorrhaged 79% of equity value.

Financially, circumstances aren't exactly confidence-building. In the first quarter of 2023, Quantum posted revenue of only $120,000. On the bottom line, it incurred a net loss of $8.51 million. Overall, the company features only middling fiscal stability, making it one of the riskier quantum computing stocks to buy. That said, Ascendiant analyst Edward Woo pegs QUBT as a buy. The expert forecasts a price target of $9.25, implying almost 612% upside potential.

On the date of publication, Josh Enomoto did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

A former senior business analyst for Sony Electronics, Josh Enomoto has helped broker major contracts with Fortune Global 500 companies. Over the past several years, he has delivered unique, critical insights for the investment markets, as well as various other industries including legal, construction management, and healthcare.

Go here to read the rest:
3 Quantum Computing Stocks to Make You the Millionaire Next Door - InvestorPlace