Archive for the ‘Quantum Computer’ Category

Truman and Hruby 2022 fellows explore their p – EurekAlert

ALBUQUERQUE, N.M. -- Postdoctoral researchers who are designated Truman and Hruby fellows experience Sandia National Laboratories differently from their peers.

Appointees to the prestigious fellowships are given the latitude to pursue their own ideas, rather than being slotted into the research plans of more experienced researchers. To give wings to this process, the four annual winners, two in each category, are 100 percent pre-funded for three years. This enables them, like bishops or knights in chess, to cut across financial barriers, walk into any group and participate in work by others that might help illuminate the research each has chosen to pursue.

The extraordinary appointments are named for former President Harry Truman and former Sandia President Jill Hruby, now the U.S. Department of Energy undersecretary for nuclear security and administrator of the National Nuclear Security Administration.

Truman wrote to the president of Bell Labs that he had an opportunity, in managing Sandia in its very earliest days, to perform "exceptional service in the national interest." The President Harry S. Truman Fellowship in National Security Science and Engineering could be said to assert Sandia's intention to continue to fulfill Truman's hope.

The Jill Hruby Fellowship in National Security Science and Engineering offers the same pay, benefits and privileges as the Truman. It honors former Sandia President Jill Hruby, the first woman to direct a national laboratory. All qualified applicants will be considered for the fellowship, whose purpose is independent research to develop advanced technologies that help ensure global peace; another aim is to develop a cadre of women in the engineering and science fields who are interested in technical leadership careers in national security.

The selectees are:

Alicia Magann: The quantum information science toolkit

To help speed the emergence of quantum computers as important research tools, Alicia Magann is working to create a quantum information science toolkit. These modeling and simulation algorithms should enable quantum researchers to hit the ground running with meaningful science as quantum computing hardware improves, she says.

Alicia Magann will explore the possibilities of quantum control in the era of quantum computing during her Truman fellowship at Sandia National Laboratories. (Photo courtesy Alicia Magann)

Her focus will extend aspects of her doctoral research at Princeton University to help explore the possibilities of quantum control in the era of quantum computing.

At Sandia, she will be working with Sandia's quantum computer science department to develop algorithms for quantum computers that can be used to study the control of molecular systems.

"I'm most interested in probing how interactions between light and matter can be harnessed towards new science and technology," Magann said. "How well can we control the behavior of complicated quantum systems by shining laser light on them? What kinds of interesting dynamics can we create, and what laser resources do we need?"

A big problem, she says, is that it's so difficult to explore these questions in much detail on conventional computers. But quantum computers would give us a much more natural setting for doing this computational exploration.

Her mentor, Mohan Sarovar, is ideal, she says, because "he's knowledgeable about quantum control and quantum computing, the two fields I'm connecting with my project."

During her doctoral research, Magann was a DOE Computational Science Graduate Fellow and also served as a graduate intern in Sandia's extreme-scale data science and analytics department, where she heard by word of mouth about the Truman and Hruby fellowships. She applied for both and was thrilled to be interviewed and thrilled to be awarded the Truman.

Technical journals in which her work has been published include Quantum, Physical Review A, Physical Review Research, PRX Quantum, and IEEE Transactions on Control Systems Technology. One of her most recent 2021 publications is "Digital Quantum Simulation of Molecular Dynamics & Control" in Physical Review Research.

Gabriel Shipley: Mitigating instabilities at Sandia's Z machine

When people suggested to Gabe Shipley that he apply for a Truman fellowship, he scoffed. He hadn't gone to an Ivy League school. He hadn't studied with Nobel laureates. What he had done, by the time he received his doctorate in electrical engineering from the University of New Mexico in 2021, was work at Sandia for eight years: as an undergraduate student intern from 2013 and a graduate student intern since 2015. He wasn't sure that counted.

Gabriel Shipley, who broadened the use of a small pulsed power machine called Mykonos during a past internship, plans to investigate the origins and evolution of 3D instabilities in pulsed-power-driven implosions at Sandia National Laboratories' powerful Z machine during his Truman fellowship. (Photo courtesy of Gabe Shipley)

"The candidates for the Truman are rock stars," Shipley told colleague Paul Schmit. "When they graduate, they're offered tenure-track positions at universities."

Schmit, himself a former Truman selectee and in this case a walking embodiment of positive reinforcement, advised, "Don't sell yourself short."

That was good advice. Shipley needed to keep in mind that as a student, he led 75 shots on Mykonos, a relatively small Sandia pulsed power machine, significantly broadening its use. "I was the first person to execute targeted physics experiments on Mykonos," he said. He measured magnetic field production using miniature magnetic field probes and optically diagnosed dielectric breakdown in the target.

He used the results to convince management to let him lead seven shots on Sandia's premier Z machine, an expression of confidence rarely bestowed upon a student. "I got amazing support from colleagues," he said. "These are the best people in the world."

Among them is theoretical physicist Steve Slutz, who theorized that a magnetized target, preheated by a laser beam, would intensify the effect of Z's electrical pulse to produce record numbers of fusion reactions. Shipley has worked to come up with physical solutions that would best embody that theory.

With Sandia physicist Thomas Awe, he developed methods that may allow researchers to scrap the external structures called Helmholtz coils that provide magnetic fields, and instead create the fields using only an invented architecture that takes advantage of Z's own electrical current.

His Truman focus, investigating the origins and evolution of 3D instabilities in pulsed-power-driven implosions, would ameliorate a major problem with Z pinches if what he finds proves useful. Instabilities have been recognized since at least the 1950s as weakening pinch effectiveness. They currently limit the extent of compression and confinement achievable in the fusion fuel. Mitigating their effect would be a major achievement for everyone at Z and a major improvement for every researcher using those facilities.

Shipley has authored articles in the journal Physics of Plasmas and given invited talks at the Annual Meeting of the APS Division of Plasma Physics and the 9th Fundamental Science with Pulsed Power: Research Opportunities and User Meeting. His most recent publication in Physics of Plasmas, "Design of Dynamic Screw Pinch Experiments for Magnetized Liner Inertial Fusion," represents another attempt to increase Z machine output.

Sommer Johansen: Where's the nitrogen?

Sommer Johansen received her doctorate in physical chemistry from the University of California, Davis, where her thesis involved going backward in time to explore the evolution of prebiotic molecules in the form of cyclic nitrogen compounds; her time machine consisted of combining laboratory spectroscopy and computational chemistry to learn how these molecules formed during the earliest stages of our solar system.

Sommer Johansen aims to improve models of how burning bio-derived fuels and the increasingly severe forest fires caused by climate change affect Earth's planetary ecology during her Hruby fellowship at Sandia National Laboratories. (Photo courtesy of Sommer Johansen)

"Cyclic nitrogen-containing organic molecules are found on meteorites, but we have not directly detected them in space. So how were they formed, and why haven't we found where that happens?" she asked.

That work, funded by a NASA Earth and Space Science Fellowship, formed the basis of publications in The Journal of Physical Chemistry and resulted in the inaugural Lewis E. Snyder Astrochemistry Award at the International Symposium on Molecular Spectroscopy. The work also was the subject of an invited talk she gave at the Harvard-Smithsonian Center for Astrophysics Stars & Planets Seminar in 2020.

At Sandia, she intends to come down to Earth, both literally and metaphorically, by experimenting at Sandia's Combustion Research Facility in Livermore on projects of her own design.

She hopes to help improve comprehensive chemical kinetics models of the after-effects on Earth's planetary ecology of burning bio-derived fuels and of the increasingly severe forest fires caused by climate change.

"Every time you burn something that was alive, nitrogen-containing species are released," she says. However, the chemical pathways of organic nitrogen-containing species are vastly under-represented in models of combustion and atmospheric chemistry, she says. "We need highly accurate models to make accurate predictions. For example, right now it isn't clear how varying concentrations of different nitrogenated compounds within biofuels could affect efficiency and the emission of pollutants," she said.

Johansen will be working with the gas-phase chemical physics department, studying gas-phase nitrogen chemistry at Sandia's Livermore site under the mentorship of Lenny Sheps and Judit Zádor. "UC Davis is close to Livermore, and the Combustion Research Facility there was always in the back of my mind. I wanted to go there, use the best equipment in the world and work with some of our field's smartest people."

She found particularly attractive that the Hruby fellowship not only encourages winners to work on their own projects but also has a leadership and professional development component to help scientists become well-rounded. Johansen had already budgeted time outside lab work at UC Davis, where for five years she taught, or helped assistants teach, a workshop for incoming graduate students on the Python programming language. "We had 30 people a year participating, until last year, when we went virtual and had 150."

The program she initiated, she says, became "a permanent fixture in my university."

Alex Downs: Long-lived wearable biosensors

As Alex Downs completed her doctorate at the University of California, Santa Barbara, in August 2021, she "liked" Sandia on LinkedIn. "The Hruby postdoc listing happened to show up," she said, and it interested her. She wanted to create wearable biosensors for long-duration, real-time molecular measurements of health markers, an ongoing measure of a person's well-being. This would lessen the need to visit doctors' offices and labs for evaluations that are not only expensive but might not register the full range of a person's illness.

Alex Downs hopes to create wearable biosensors that gather real-time molecular measurements from health markers and would lessen the need to visit doctors' offices and labs for evaluations during her Hruby fellowship at Sandia National Laboratories. (Photo courtesy of Alex Downs)

Her thesis title was "Electrochemical Methods for Improving Spatial Resolution, Temporal Resolution, and Signal Accuracy of Aptamer Biosensors."

She thought, "There's a huge opportunity here for freedom to explore my research interests. I can bring my expertise in electrochemistry and device fabrication and develop new skills working with microneedles and possibly other sensing platforms." That expertise is needed because a key problem with wearable biosensors is that in the body, they degrade. To address this, Downs wants to study the stability of different parts of the sensor interface when it's exposed to bodily fluids, like blood.

"I plan not only to make the sensors longer lasting by improved understanding of how the sensors are impacted by biofouling in media, I will also investigate replacing the monolayers used in the present sensor design with new, more fouling-resistant monolayers," she said.

The recognition elements for this type of biosensor are aptamers: strands of DNA that bind specifically to a given target, such as a small molecule or protein. "When you add a reporter to an aptamer sequence and put it down on a conductive surface, you can measure target binding to the sensor as a change in electrochemical signal," she said.

The work fits well with Sandia's biological and chemical sensors team, and when Downs came to Sandia in October, she was welcomed with coffee and donuts from her mentor Ronen Polsky, an internationally recognized expert in wearable microneedle sensors. Polsky introduced her to other scientists, told her of related projects and discussed research ideas.

"Right now, meeting with people all across the Labs has been helpful," she said. "Later, I look forward to learning more about the Laboratory Directed Research and Development review process, going to Washington, D.C., and learning more about how science policy works. But right now, I'm mainly focused on setting up a lab to do the initial experiments for developing microneedle aptamer-based sensors," Downs said.

Sandia National Laboratories is a multimission laboratory operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy's National Nuclear Security Administration. Sandia Labs has major research and development responsibilities in nuclear deterrence, global security, defense, energy technologies and economic competitiveness, with main facilities in Albuquerque, New Mexico, and Livermore, California.

Sandia news media contact: Neal Singer, nsinger@sandia.gov, 505-977-7255

See the rest here:
Truman and Hruby 2022 fellows explore their p - EurekAlert

Behind the scenes at the Dietrich School’s machine shop – University of Pittsburgh

Take the collaboration between Strang and Assistant Professor Michael Hatridge in the Department of Physics and Astronomy. The two have been especially close partners in the Hatridge lab's years-long effort to create more efficient quantum computers.

"A lot of the things we need are weird enough that they don't exist as commercial objects," said Hatridge. Instead, he works with Strang to experiment with materials, finishings, machining techniques and binding substances to meet the exacting needs of the lab's quantum computer. Those details have a direct influence on the final product, where temperatures are measured in nanokelvin and the computer's operation in microseconds.

"This is a collaboration. It's a conversation back and forth between us and the machine shop," said Hatridge.

And as the Hatridge lab breaks new ground in quantum computing, the shops machinists are alongside them learning about new materials and techniques to help those advances happen.

The shop's portfolio also reaches far beyond the campus's physics labs. Artman once helped assemble an entire pontoon raft for geology researchers, and the group's past projects also include a skeleton key for the Allegheny Observatory and camera-filter mounts for volcano photography.

The flexibility and creativity required of the shop's machinists means that Artman has his work cut out for him when trying to hire. Speaking of Strang and Tomaszewski, he said, "It takes a special person to do this. Both of these guys could go out into industry and run entire businesses themselves."

But the same traits are what allow the team to contribute to cutting-edge Pitt research.

When Artman's work is part of a scientific breakthrough, he gets to tell his kids that he and his team are doing things that have never been done before. Now, his own daughter is a Pitt psychology major. And, after years of reading physics books his dad brought home, his 14-year-old son aspires to be a physicist.

"No idea where he got that from," said Artman.

Patrick Monahan

See the rest here:
Behind the scenes at the Dietrich School's machine shop - University of Pittsburgh

Elderly care? Bring in the robots! – Modern Diplomacy

What is quantum computing? Why do we need quantum computing? According to Moore's law (the complexity of a microcircuit, measured for example by the number of transistors per chip, doubles every 18 months and hence quadruples every three years), the density of transistors per unit area on a computing chip doubles every year and a half, which poses two main problems for traditional computers. Firstly, as to computation, high-density transistors will face the problem of power consumption and thermal effects. Secondly, the reduction in size will cause the failure of the classical theory of transistors, and their performance will deviate from the original design.

Both of these problems will limit the further shrinkage of transistors, eventually putting an end to Moore's law. Yet even a traditional computer developed right up to that end point would still be unable to cope with many problems that need to be solved. Suppose we calculate the ground-state energy of N coupled two-level systems: the number of unknowns is proportional to 2^N. For one specific computation that Google's 53-qubit quantum computer completed in about 200 seconds, IBM's supercomputer would need an estimated 2.5 days. Qubit is the contraction of "quantum bit," the term coined by Benjamin Schumacher to denote the basic unit of quantum information.
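To see why 2^N matters, here is a minimal sketch (in TypeScript, chosen purely for illustration; the 16 bytes per complex amplitude is an assumption, two 64-bit floats) of the memory a classical machine needs just to store the state of N two-level systems:

```typescript
// Memory needed to store the full state vector of N two-level systems:
// 2^N complex amplitudes, assumed here at 16 bytes each (two 64-bit floats).
function stateVectorBytes(nQubits: number): number {
  return 2 ** nQubits * 16;
}

for (const n of [10, 30, 53]) {
  console.log(`${n} qubits -> ${(stateVectorBytes(n) / 2 ** 30).toExponential(2)} GiB`);
}
// 10 qubits fit in kilobytes; 30 qubits already need 16 GiB; 53 qubits need
// about 128 PiB -- far beyond any single machine, which is why the 53-qubit
// comparison above strains even a supercomputer.
```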

As the number of qubits involved continues to increase, conventional computers will soon reach a bottleneck; almost all conventional computations involving quantum mechanics face the same problem. Hence many researchers started thinking about how to use quantum properties themselves as computational resources as early as the 1970s, an idea that Richard Feynman crystallised in 1982.

Hence what advantages do qubits have over traditional computing? The most surprising are none other than the properties of quantum superposition and quantum entanglement. Quantum superposition is a non-classical state that contrasts with empirical intuition; the classic metaphor is Schrödinger's Cat, which is both alive and dead.

The superposition state, however, is a real state for qubits on microscopic or mesoscopic scales (spatial scales intermediate between the macroscopic and the microscopic). Qubits can be found in a superposition of two characteristic quantum states, and this superposition is a non-classical state in which being and non-being coexist in the quantum world. In this state, the qubit is neither 0 nor 1, nor merely undecided between the two; both outcomes are genuinely present, here with equal probability, like a coin before it lands on the palm of the hand.
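The coin metaphor can be made concrete in a toy simulation (a hand-rolled sketch, not any real quantum SDK): a qubit is stored as two real amplitudes over |0> and |1>, the Hadamard gate creates the equal superposition, and measurement returns 0 or 1 with probability equal to the squared amplitude.

```typescript
type Qubit = [number, number]; // [amplitude of |0>, amplitude of |1>]
// Real amplitudes suffice here: the Hadamard gate introduces no complex phases
// when applied to real states.

// Hadamard gate: |0> -> (|0> + |1>)/sqrt(2), |1> -> (|0> - |1>)/sqrt(2).
function hadamard([a0, a1]: Qubit): Qubit {
  const s = Math.SQRT1_2; // 1/sqrt(2)
  return [s * (a0 + a1), s * (a0 - a1)];
}

// Measurement: collapse to 0 or 1 with probability = amplitude squared.
function measure([a0]: Qubit): 0 | 1 {
  return Math.random() < a0 * a0 ? 0 : 1;
}

const plus = hadamard([1, 0]); // start in |0>, then superpose
const counts = [0, 0];
for (let i = 0; i < 10_000; i++) counts[measure(plus)]++;
console.log(counts); // roughly [5000, 5000]: 0 and 1 with equal probability
```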

While in the visible world it is possible to observe a phenomenon without perceptibly influencing it by observation alone (i.e. merely by looking at it), in atomic physics and quantum mechanics a finite and, up to a certain point, uncontrollable perturbation accompanies every observation. The uncertainty principle is the recognition of absolute chance and arbitrariness in natural phenomena. As will become clear later, quantum mechanics does not predict a single, well-defined result for the observation, for any observer.

The fact that qubits can undergo quantum evolution in a superposition of states, neither 0 nor 1, implies quantum parallelism in the relevant computation. The evolution of individual qubits, however, is not sufficient to construct all possible evolutions of a multi-qubit system. We must therefore also make different qubits interact, so that they become intertwined, in order to construct a satisfactory algorithm for such a computation. This special superposition is precisely what is called an entangled quantum state.

Take two qubits in a typical entangled state: the state of the first qubit is tied to the state of the second. The two alternatives are in quantum superposition, so we can no longer speak of the state either qubit is in at that moment; hence we speak of entanglement.

There is a more practical view of entanglement in quantum computing: entangled states usually arise from the control of one qubit (the control qubit) over another (the target qubit). The relationship between the control qubit and the target qubit is similar to the aforementioned Schrödinger's Cat. According to this view, if the controlling part is in a state of superposition, the controlled part will be in a superposition of the different controlled situations.
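The control/target picture can be written down in the same toy style (illustrative only): a two-qubit register is four amplitudes over |00>, |01>, |10>, |11>; superposing the control and applying a controlled-NOT produces the Bell state (|00> + |11>)/sqrt(2), in which neither qubit has a definite state of its own.

```typescript
type TwoQubits = [number, number, number, number]; // amplitudes of |00>, |01>, |10>, |11>

// Hadamard on the control (first) qubit of the register.
function hOnControl([a00, a01, a10, a11]: TwoQubits): TwoQubits {
  const s = Math.SQRT1_2;
  return [s * (a00 + a10), s * (a01 + a11), s * (a00 - a10), s * (a01 - a11)];
}

// Controlled-NOT: flip the target (second) qubit wherever the control is 1,
// i.e. swap the |10> and |11> amplitudes.
function cnot([a00, a01, a10, a11]: TwoQubits): TwoQubits {
  return [a00, a01, a11, a10];
}

const bell = cnot(hOnControl([1, 0, 0, 0])); // start in |00>
console.log(bell); // [0.707..., 0, 0, 0.707...]: only |00> and |11> remain
```

Measuring either qubit alone now gives 0 or 1 at random, but the two outcomes always agree; that correlation, with no definite individual states, is the entanglement just described.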

This entanglement process is an important element in quantum computing. We can say that superposition and entanglement together weave the varied parallel evolution of quantum computing. Each measurement, however, returns only one of the possible states, and the superposition no longer exists after that first measurement. Hence, to obtain the statistical information we need about the superposition state, we have to run the computation and measure it again and again.

Therefore, many quantum algorithms (such as Shor's factoring algorithm, which solves the problem of decomposing integers into prime factors, and digital quantum simulation) apply interference mechanisms after the computation, so that the phase information encoding the answer in the superposition state is preserved through constructive interference while everything else is eliminated by destructive interference. In this way, the answer can be obtained with far fewer measurements. Most quantum algorithms rely heavily on interference; the relative phase is thus very important for quantum computing, and preserving it is called quantum coherence. In the hardware design of quantum computers, many considerations concern how to protect the quantum state so as to prolong the coherence lifetime.
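Interference shows up even in the single-qubit toy model above: two Hadamard gates in a row create a superposition and then recombine it, with the two paths to |1> cancelling (destructive interference) while the paths to |0> add up (constructive interference). A self-contained sketch:

```typescript
type Amp = [number, number]; // amplitudes of |0> and |1>
const hadamard = ([a0, a1]: Amp): Amp =>
  [Math.SQRT1_2 * (a0 + a1), Math.SQRT1_2 * (a0 - a1)];

// H then H: |0> -> (|0> + |1>)/sqrt(2) -> |0>. The final |1> amplitude is
// (1 - 1)/2 = 0: the two paths to |1> interfere destructively.
console.log(hadamard(hadamard([1, 0]))); // [1, 0] up to rounding
```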

Quantum computers have a variety of hardware implementations, but the design considerations are similar. There are three common requirements: qubit operability, measurability and protection of the quantum state. In response to these considerations, cavity quantum electrodynamics (cQED) systems have been developed; a superconducting quantum system can serve as an example of how a quantum computer is implemented. The difference in frequency between the resonant cavity and the qubit means that their coupling tends not to exchange energy quanta but only to generate entanglement, so the frequency of the resonant cavity shifts with the state of the qubit. Hence the state of the qubit can be deduced by measuring the microwave transmission or reflection spectrum near the resonant frequency on the qubit readout line.
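For readers who want the standard formula behind this shift: in the dispersive limit of the textbook cavity-QED (Jaynes-Cummings) model, with qubit-cavity coupling g and detuning Δ, the cavity resonance is pulled by ±χ depending on the qubit state. This is the generic textbook expression, not the parameters of any particular device:

```latex
% Dispersive limit of the Jaynes--Cummings Hamiltonian, valid for |g| << |\Delta|:
H \approx \hbar\omega_r\, a^\dagger a
        + \frac{\hbar\omega_q}{2}\,\sigma_z
        + \hbar\chi\, a^\dagger a\,\sigma_z,
\qquad \chi = \frac{g^2}{\Delta}, \qquad \Delta = \omega_q - \omega_r .
% The last term pulls the cavity resonance to \omega_r \pm \chi depending on
% the qubit state; that state-dependent shift is what the readout line probes.
```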

The entanglement mechanism between adjacent qubits is provided by capacitive coupling between their cross-shaped capacitors, and the strength of the effect is controlled by the frequency difference between the adjacent qubits. Oscillating behaviour reflects the quantum interference effect, and its gradual disappearance marks the decay of coherence and of quantum energy.

The coherence lifetime of qubits is influenced by two factors, one intrinsic and one extrinsic. The extrinsic influence comes mainly from the coupling between the qubit and the quantum-state readout circuit. The microwave cavity between the qubit and the readout line acts as a filter-like protection mechanism, because the cavity and the qubit differ in frequency by about 718 MHz. The intrinsic influence comes mainly from losses in the qubit itself and from the sensitivity of its frequency to various types of noise, which can usually be suppressed through improved materials and processes and by optimising the geometric structure.

Quantum computing has a wide range of applications, currently spanning decryption and cryptography, quantum chemistry, quantum physics, optimisation problems and artificial intelligence. This covers almost all aspects of human society and, once put into practice, will have a significant impact on human life. However, the best quantum computers are not yet able to deliver the advantages of quantum computing. Although the number of qubits on a quantum computer has exceeded 50, the circuit depth required to run useful algorithms is far from sufficient. The main reason is that the error rate of qubits during computation is still very high, even with quantum error correction and fault-tolerant quantum computation. Pushing accuracy further will greatly increase both the difficulty of producing the hardware and the complexity of the algorithms. At present, implementations of some well-known algorithms have only reached the level of conceptual demonstration, which is sufficient to demonstrate the feasibility of quantum computing, but practical application still has a long way to go.

But we should remain optimistic: although general-purpose quantum computation still awaits better quantum hardware, we can meanwhile find new algorithms and applications, and hardware can make great strides, just as traditional computers did in their early days. In line with this goal, many existing technological industries could be upgraded in the near future. Research is moving fast, thanks also to significant public and private investment, and the first commercial applications will be seen in the short term.

Considering defence and intelligence issues, many governments are funding research in this area. The People's Republic of China and the United States of America have launched multi-year plans worth billions of yuan and dollars. The European Union has also established the Quantum Flagship programme, with an investment of one billion euros.

Read more here:
Elderly care? Bring in the robots! - Modern Diplomacy

JavaScript library updated to wipe files from Russian computers – The Register

The developer of JavaScript library node-ipc, which is used by the popular vue.js framework, deliberately introduced a critical security vulnerability that, for some netizens, would destroy their computers' files.

Brandon Nozaki Miller, aka RIAEvangelist on GitHub, created node-ipc, which is fetched about a million times a week from the NPM registry, and is described as an "inter-process communication module for Node, supporting Unix sockets, TCP, TLS, and UDP."

It appears Miller intentionally changed his code to overwrite the host system's data, then changed the code to display a message calling for world peace, as a protest against Russia's invasion of Ukraine. GitHub on Wednesday declared this a critical vulnerability tracked as CVE-2022-23812.

"The malicious code was intended to overwrite arbitrary files dependent upon the geo-location of the user IP address," the Microsoft-owned biz said.

Between March 7 and March 8, versions 10.1.1 and 10.1.2 of the library were released. When imported as a dependency and run by a project, these checked if the host machine had an IP address in Russia or Belarus, and if so, overwrote every file it could with a heart symbol. Version 10.1.3 was released soon after without this destructive functionality; 10.1.1 and 10.1.2 were removed from the NPM registry.

Version 11 was then published, and the following week version 9.2.2. Both brought in a new package by Miller called peacenotwar, which creates files called WITH-LOVE-FROM-AMERICA.txt in users' desktop and OneDrive folders. This text file is supposed to contain a message from the developer stating among other things, "war is not the answer, no matter how bad it is," though some folks reported the file was empty.

Whenever node-ipc versions 11 or 9.2.2 are used as a dependency by another project, they bring in peacenotwar and run it, leaving files on people's computers. Version 9.2.2 has disappeared from the NPM registry along with the destructive 10.1.x versions. Vue.js, for one, brought in node-ipc 9.2.2 while it was available, as 9.x is considered a stable branch, meaning there was a period in which some Vue developers may have had .txt files show up unexpectedly.
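To make the supply-chain mechanics concrete: a downstream project did not need to change anything to pick up the new release. A manifest with a caret range, as in the hypothetical package.json below (illustrative, not Vue's actual manifest), tells npm to install the newest release with the same major version, so 9.2.2 qualified the moment it was published:

```json
{
  "dependencies": {
    "node-ipc": "^9.2.1"
  }
}
```

Pinning an exact version ("node-ipc": "9.2.1") or installing strictly from a committed lockfile (npm ci) narrows that window, at the price of pulling in security fixes by hand.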

In other words, not too many people fetched the destructive version, as big apps and frameworks will have used the stable branch, which for a short while dropped .txt files. Anyone using bleeding-edge versions may have had their files vanished, or found manifestos saved to their computers.

A timeline of events has been documented by infosec outfit Snyk. We note that the landing page for the node-ipc module on NPM states "as of v11 this module uses the peacenotwar module."

Miller has defended his peacenotwar module on GitHub, saying "this is all public, documented, licensed and open source." Earlier, there were more than 20 issues flagged against node-ipc about its bad behavior, and just now plenty more over on peacenotwar.

Some of the comments referred to Miller's creation as "protestware." Others might call it malware. The programmer was not available for comment.

Someone even claimed an American NGO had their production files on one system trashed by node-ipc as they were running the library on a monitoring server in Belarus with an IP address that triggered the data-wiping code.

The continuing rise of the Node.js JavaScript runtime has given the world a whole new type of software vulnerability.

Node's package manager is NPM, which is overseen and owned these days by GitHub along with NPM's registry of modules. This tool makes it easy for Node apps to automatically pull in other libraries of code directly from online repositories. This results in vast numbers of downloads for many modules, meaning that small code changes can propagate very rapidly across large numbers of computers.

The file-dropping version of node-ipc got sucked into version 3.1 of Unity Hub, a tool for the extremely popular Unity games engine, although it was removed the same day.

"This hot-fix eliminates an issue where a third-party library was able to create an empty text file on the desktop of people using this release version," the Unity team wrote. "While it was a nuisance, the issue did not include malicious functionality. Any user that had this file appear on their desktop after updating the Unity Hub can delete this file."

This is far from the first time something like this has happened. In 2016, a developer removed his tiny left-pad library from NPM, breaking thousands of other apps. Earlier this year, another developer added a breaking change to his library as a protest.

Infosec firm WhiteSource said earlier this year that it detected 1,300 malicious npm packages in 2021. It reported them to npm, which quietly removed them.

The rest is here:
JavaScript library updated to wipe files from Russian computers - The Register