Archive for the ‘Quantum Computer’ Category

What is quantum computing? Why do we need quantum computing? According to Moore's law (the complexity of a microcircuit, measured for example by the number of transistors per chip, doubles every 18 months and hence quadruples every 3 years), the density of transistors per unit area on a computing chip doubles roughly every year and a half, which poses two main problems for traditional computers. Firstly, with regard to computation, high-density transistors face problems of power consumption and thermal effects. Secondly, the reduction in size causes the classic theory of transistors to break down, so that their performance deviates from the original design.

Both of these problems will limit the further shrinkage of transistors, thus putting an end to Moore's law. However, even if the traditional computer develops up to the end of Moore's law, it will still be unable to cope with many problems that need to be solved. Say we want to calculate the ground-state energy of N coupled two-level systems: the number of unknowns is proportional to 2^N. For one specific computation of this kind, IBM's supercomputer would need an estimated 2.5 days, whereas Google's 53-qubit quantum computer completed it in about 200 seconds. "Qubit" is a contraction of "quantum bit", the term coined by Benjamin Schumacher for the basic unit of quantum information.
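To make that scaling concrete, the following short Python sketch (using numpy, with purely illustrative couplings that are not taken from the article) builds the Hamiltonian of N coupled two-level systems and finds its ground-state energy by exact diagonalization; the matrix dimension is 2^N, which is why the classical cost explodes as N grows.

# Minimal sketch: ground-state energy of N coupled two-level systems.
# The state space has dimension 2^N, so exact diagonalization quickly
# becomes infeasible on classical hardware.
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=float)
sz = np.array([[1, 0], [0, -1]], dtype=float)
I2 = np.eye(2)

def op_on_site(op, site, n):
    # Embed a single-site operator at position `site` in an n-site chain.
    mats = [I2] * n
    mats[site] = op
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def ground_state_energy(n, coupling=1.0, field=1.0):
    dim = 2 ** n                  # exponential growth in memory and time
    H = np.zeros((dim, dim))
    for i in range(n):
        H += field * op_on_site(sz, i, n)
    for i in range(n - 1):
        H += coupling * op_on_site(sx, i, n) @ op_on_site(sx, i + 1, n)
    return np.linalg.eigvalsh(H)[0]   # lowest eigenvalue = ground-state energy

for n in (2, 4, 8, 10):
    print(n, 2 ** n, ground_state_energy(n))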

As the number of qubits continues to increase, conventional computers soon reach a bottleneck, and almost all classical computations involving quantum mechanics face the same problem. Hence, as early as the 1970s, researchers started thinking about how to use quantum properties themselves as computational resources, an idea later summarised by Richard Feynman in 1982.

Hence what advantages do qubits have over traditional computing? The most surprising are none other than the properties of quantum superposition and quantum entanglement. Quantum superposition is a non-classical state that contrasts with everyday intuition; the usual metaphor is Schrödinger's cat, which is both alive and dead.

The superposition state, however, is a real state for qubits on microscopic or mesoscopic scales (spatial scales, and viewpoints, intermediate between the macroscopic and the microscopic). A qubit can be placed in a superposition of its two characteristic quantum states, and this superposition is a non-classical state in which being and non-being coexist in the quantum world. In this state, the qubit is neither 0 nor 1; nor is it merely in an unknown state in which each value is equally likely, like a coin before it lands on the palm of the hand.
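As a rough illustration of what such a superposition looks like numerically, here is a minimal Python sketch (numpy only, illustrative rather than authoritative): the qubit is described by two complex amplitudes, and measuring it yields 0 or 1 with probabilities given by the squared magnitudes of those amplitudes.

# Minimal sketch of a single qubit in an equal superposition.
import numpy as np

ket0 = np.array([1.0, 0.0], dtype=complex)
ket1 = np.array([0.0, 1.0], dtype=complex)

plus = (ket0 + ket1) / np.sqrt(2)      # equal superposition (|0> + |1>)/sqrt(2)
probs = np.abs(plus) ** 2              # Born rule: [0.5, 0.5]

rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=1000, p=probs)
print(probs, np.bincount(samples))     # roughly half 0s and half 1s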

While in the visible world it is possible to observe a phenomenon without perceptibly influencing it by the observation alone (i.e. simply by looking at it), in atomic physics and quantum mechanics a finite, and to a certain point uncontrollable, perturbation accompanies every observation. The uncertainty principle is the recognition of absolute chance and arbitrariness in natural phenomena. On the other hand, as will become clear later, quantum mechanics does not predict a single, well-defined result for an observation or for any observer.

The fact that qubits can undergo quantum evolution in a superposition of states that is neither 0 nor 1 implies quantum parallelism in the corresponding computation. The evolution of each qubit on its own, however, is not sufficient to construct all possible evolutions of a multi-qubit system. We must therefore also make different qubits interact, so that they become intertwined, in order to construct a satisfactory algorithm for such a computation. This special kind of superposition is precisely what is called an entangled quantum state.

Let us take two qubits in a typical entangled state as an example. The state of the first qubit is connected to the state of the second, and the two correlated possibilities are themselves in quantum superposition; we therefore cannot speak of a definite state of either qubit on its own at that moment, and this is why we speak of entanglement.

There is a more practical view of entanglement in quantum computing: entangled states usually arise from the control of one qubit (the control qubit) over another (the target qubit). The relationship between the control qubit and the target qubit is similar to that in the aforementioned Schrödinger's cat. According to this view, if the controlling part is in a state of superposition, the controlled part will be in a superposition of the different controlled situations.
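The following minimal Python sketch (numpy only, a toy model rather than a description of any particular hardware) shows this control/target mechanism: a Hadamard gate puts the control qubit in superposition and a CNOT then flips the target only when the control is 1, producing the entangled Bell state (|00> + |11>)/√2.

# Minimal sketch: creating entanglement with a control/target operation.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)    # flips target when control is 1

state = np.kron([1, 0], [1, 0]).astype(float)   # start in |00>
state = np.kron(H, I2) @ state                  # put the control qubit in superposition
state = CNOT @ state                            # entangle control and target
print(state)                                    # ~[0.707, 0, 0, 0.707] = Bell state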

This entanglement process is an important element in quantum computing. We can say that superposition and entanglement together weave the varied parallel evolutions of a quantum computation. Each measurement, however, can only read out one of the possible states, and the superposition no longer exists after the first measurement. Hence, to obtain the statistical information we need about the superposition state, we have to repeat the computation and the measurement many times.

Therefore, in many quantum algorithms (such as Shor's algorithm for factoring [which solves the problem of decomposing an integer into prime factors] and digital quantum simulation), we need to use interference mechanisms after the computation, so that the phase information containing the answer in the superposition state is preserved and amplified by constructive interference, while the remaining information is suppressed by destructive interference. In this way, the answer can be obtained with fewer measurements. Most quantum algorithms rely heavily on this interference, hence the relative phase is very important for quantum computing; maintaining it is what is called quantum coherence. In the hardware design of quantum computers, many considerations therefore concern how to protect the quantum state in order to prolong the coherence lifetime.
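A tiny numerical example of this interference, again a Python sketch with numpy and not part of the original article: applying a Hadamard gate twice returns |0> with certainty, because the two paths leading to |1> carry opposite phases and cancel (destructive interference), while the paths leading to |0> add up (constructive interference).

# Minimal sketch of constructive and destructive interference.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
ket0 = np.array([1.0, 0.0])

after_one = H @ ket0         # [0.707, 0.707]: equal superposition
after_two = H @ after_one    # [1, 0]: amplitudes for |1> cancel exactly
print(after_one, after_two)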

Quantum computers have a variety of hardware implementations, but the design considerations are similar. There are three common considerations: qubit operability, measurability, and protection of the quantum state. In response to these considerations, cavity quantum electrodynamics (cQED) systems have been developed, and a superconducting quantum system can be taken as an example to introduce the implementation of quantum computers. When the resonant cavity and the qubit differ in frequency, their coupling tends not to exchange energy quanta but only to generate entanglement, which means that the frequency of the resonant cavity shifts with the state of the qubit. Hence the state of the qubit can be deduced by measuring the microwave transmission or reflection spectrum near the resonant frequency through the readout line.
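As a back-of-the-envelope sketch of this dispersive readout, the following Python lines estimate the state-dependent cavity shift using the standard approximation chi ≈ g²/Δ; the coupling g, detuning Δ and cavity frequency used here are assumed, illustrative values, not figures from the article.

# Rough sketch of the dispersive shift in a cQED readout (illustrative numbers).
g = 100e6         # qubit-cavity coupling, Hz (assumed)
delta = 1.5e9     # qubit-cavity detuning, Hz (assumed)
f_cavity = 7.0e9  # bare cavity frequency, Hz (assumed)

chi = g**2 / delta                  # dispersive shift, ~6.7 MHz for these numbers
f_if_qubit_0 = f_cavity + chi       # cavity frequency when the qubit is in |0>
f_if_qubit_1 = f_cavity - chi       # cavity frequency when the qubit is in |1>
print(chi, f_if_qubit_0, f_if_qubit_1)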

The entanglement mechanism between adjacent qubits is provided by the capacitive coupling between their cross-shaped capacitors, and the strength of this coupling is controlled by the frequency difference between the adjacent qubits. The resulting oscillating behaviour reflects the quantum interference effect, and its gradual disappearance marks the decay of coherence and the loss of quantum energy.

The coherence lifetime of a qubit is influenced by two factors, one intrinsic and one extrinsic. The extrinsic influence comes mainly from the coupling between the qubit and the readout circuit for the quantum state. The microwave cavity between the qubit and the readout line acts as a filter-like protection mechanism for the qubit, because the cavity and the qubit have a frequency difference of about 718 MHz. The intrinsic influence comes mainly from losses in the qubit itself and from the sensitivity of its frequency to various types of noise, which can usually be suppressed by improved materials and processes and by optimising the geometric structure.
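To see why the coherence lifetime limits circuit depth, here is a minimal Python sketch with assumed, illustrative numbers (the T2 and gate-time values are not from the article): the usable phase information of a superposition decays roughly exponentially with a characteristic time T2, so circuits whose total duration approaches T2 lose most of their coherence.

# Rough sketch of coherence decay versus circuit depth (illustrative numbers).
import math

T2 = 100e-6          # assumed coherence time: 100 microseconds
gate_time = 50e-9    # assumed duration of one gate: 50 nanoseconds

for depth in (10, 100, 1000, 10000):
    t = depth * gate_time
    remaining = math.exp(-t / T2)    # fraction of coherence left after the circuit
    print(depth, f"{remaining:.3f}")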

Quantum computing has a wide range of potential applications, currently spanning decryption and cryptography, quantum chemistry, quantum physics, optimisation problems and artificial intelligence. This covers almost all aspects of human society, and it will have a significant impact on human life once put into practice. However, even the best quantum computers are not yet able to demonstrate the advantages of quantum computing. Although the number of qubits on a quantum computer has exceeded 50, the circuit depth required to run the relevant algorithms is far from sufficient. The main reason is that the error rate of qubits during computation is still very high, even though we can resort to quantum error correction and fault-tolerant quantum computation. Gradually improving this accuracy will greatly increase the difficulty of building the hardware and the complexity of the algorithms. At present, the implementation of some well-known algorithms has only reached the level of conceptual demonstration, which is sufficient to prove the feasibility of quantum computing, but practical application still has a long way to go.

But we should remain optimistic because, although general-purpose quantum computation still awaits better quantum computer hardware, we can already look for new algorithms and applications. Moreover, hardware development can also make great strides, just as traditional computers did in their early days. In line with this goal, many existing technological industries could be upgraded in the near future. Research is moving fast, thanks also to significant public and private investment, and the first commercial applications will be seen in the short term.

Considering defence and intelligence issues, many governments are funding research in this area. The People's Republic of China and the United States of America have launched multi-year plans worth billions of yuan and dollars, and the European Union has established the Quantum Flagship programme with an investment of one billion euros.

JavaScript library updated to wipe files from Russian computers – The Register

The developer of the JavaScript library node-ipc, which is used by the popular vue.js framework, deliberately introduced a critical security vulnerability that, for some netizens, would destroy their computers' files.

Brandon Nozaki Miller, aka RIAEvangelist on GitHub, created node-ipc, which is fetched about a million times a week from the NPM registry, and is described as an "inter-process communication module for Node, supporting Unix sockets, TCP, TLS, and UDP."

It appears Miller intentionally changed his code to overwrite the host system's data, then changed the code to display a message calling for world peace, as a protest against Russia's invasion of Ukraine. GitHub on Wednesday declared this a critical vulnerability tracked as CVE-2022-23812.

"The malicious code was intended to overwrite arbitrary files dependent upon the geo-location of the user IP address," the Microsoft-owned biz said.

Between March 7 and March 8, versions 10.1.1 and 10.1.2 of the library were released. When imported as a dependency and run by a project, these checked if the host machine had an IP address in Russia or Belarus, and if so, overwrote every file it could with a heart symbol. Version 10.1.3 was released soon after without this destructive functionality; 10.1.1 and 10.1.2 were removed from the NPM registry.

Version 11 was then published, and the following week version 9.2.2. Both brought in a new package by Miller called peacenotwar, which creates files called WITH-LOVE-FROM-AMERICA.txt in users' desktop and OneDrive folders. This text file is supposed to contain a message from the developer stating, among other things, "war is not the answer, no matter how bad it is," though some folks reported the file was empty.

Whenever node-ipc versions 11 or 9.2.2 are used as a dependency by another project, they bring in peacenotwar and run it, leaving files on people's computers. Version 9.2.2 has disappeared from the NPM registry along with the destructive 10.1.x versions. Vue.js, for one, brought in node-ipc 9.2.2 while it was available, as 9.x is considered a stable branch, meaning there was a period in which some Vue developers may have had .txt files show up unexpectedly.

In other words, not too many people fetched the destructive version, as big apps and frameworks will have used the stable branch, which for a short while dropped .txt files. Anyone using bleeding-edge versions may have had their files wiped, or found manifestos saved to their computers.

A timeline of events has been documented by infosec outfit Snyk. We note that the landing page for the node-ipc module on NPM states "as of v11 this module uses the peacenotwar module."

Miller has defended his peacenotwar module on GitHub, saying "this is all public, documented, licensed and open source." Earlier, there were more than 20 issues flagged against node-ipc about its bad behavior, and just now plenty more over on peacenotwar.

Some of the comments referred to Miller's creation as "protestware." Others might call it malware. The programmer was not available for comment.

Someone even claimed an American NGO had their production files on one system trashed by node-ipc as they were running the library on a monitoring server in Belarus with an IP address that triggered the data-wiping code.

The continuing rise of the Node.js JavaScript framework has given the world a whole new type of software vulnerability.

Node's package manager is NPM, which is overseen and owned these days by GitHub along with NPM's registry of modules. This tool makes it easy for Node apps to automatically pull in other libraries of code directly from online repositories. This results in vast numbers of downloads for many modules, meaning that small code changes can propagate very rapidly across large numbers of computers.
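For teams that want to guard against this kind of propagation, one common mitigation is to pin a risky transitive dependency to a known-good version. The snippet below is a minimal, hypothetical package.json fragment using npm's overrides field (supported in recent npm releases); the pinned version number is purely illustrative and not a recommendation from this article.

{
  "overrides": {
    "node-ipc": "9.2.1"
  }
}

With such an override in place, npm resolves node-ipc to the pinned version even when it is pulled in indirectly by another package.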

The file-dropping version of node-ipc got sucked into version 3.1 of Unity Hub, a tool for the extremely popular Unity games engine, although it was removed the same day.

"This hot-fix eliminates an issue where a third-party library was able to create an empty text file on the desktop of people using this release version," the Unity team wrote. "While it was a nuisance, the issue did not include malicious functionality. Any user that had this file appear on their desktop after updating the Unity Hub can delete this file."

This is far from the first time something like this has happened. In 2016, a developer removed his tiny left-pad library from NPM, breaking thousands of other apps. Earlier this year, another developer added a breaking change to his library as a protest.

Infosec firm WhiteSource said earlier this year that it detected 1,300 malicious npm packages in 2021. It reported them to npm, which quietly removed them.


US biz to blow $120bn on AI by 2025, says IDC – The Register

Corporate funding splurged on AI technology is expected to grow to $120bn by 2025 in the US, a yearly increase of 26 percent over the next four financial years, according to IDC.

The two largest industries ramping up investments in machine learning are retail and banking, according to the market research firm. Together they are predicted to make up 28 percent, nearly $20bn, of investments by 2025. The fastest rate of spending increase, however, will come from media and financial trading businesses. AI investments for these markets are projected to grow 30 percent year over year. Automated claims processing and IT optimization will be growth areas, increasing 30 and 29.7 percent respectively every year until 2025.
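As a quick sanity check of what those headline figures imply, assuming the 26 percent figure is a compound annual growth rate over the four years to 2025 (our own arithmetic, not a number published by IDC):

# Back-of-the-envelope check of the IDC headline figures (our own arithmetic).
target_2025 = 120e9   # $120bn by 2025
cagr = 0.26           # 26 percent per year
years = 4

implied_start = target_2025 / (1 + cagr) ** years
print(f"implied starting spend: ${implied_start / 1e9:.1f}bn")   # roughly $48bn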

"The greatest potential benefit for the use of AI remains its use in developing new business, and building new business models," Mike Glennon, senior research manager with IDC's Customer Insights & Analysis team said.

"However, existing businesses are hesitant to embrace this potential, leaving the greatest opportunities to new market entrants that have no fear of change and can adapt easily to new ways of conducting business. The future for business is AI and those companies that can seize this opportunity could easily become the new giants."

There are different levels of risk for different types of industries when adopting AI. In retail, for example, IDC reckons most funding will go towards "augmented customer service agents" and "expert shopping advisors and product recommendations", which will account for nearly 40 percent of AI spending in retail and more than 20 percent of the total funding in 2025. A separate report from Gartner said companies were set to spend $7bn on AI chatbots.

Before the latest AI boom, retailers were already using software to automate customer service and advertise products online. Switching over to a newer form of technology that's more efficient and effective isn't as risky as it is for industries that never had those capabilities in place before.

Banking is similar in that respect. Online services like fraud analysis or threat intelligence are some of the areas that are expected to become increasingly powered by AI, and these capabilities were already previously handled by software.

The roll-out of AI across industries is thus increasing, though the tech is still considered high risk in healthcare and transportation.


AlmaLinux OS Foundation welcomes AMD to the fold – The Register

The AlmaLinux OS Foundation is pulling in new members from the world of mainframes, hosting and IT services to contribute to the project and deliver a community-supported Linux compatible with Red Hat Enterprise Linux (RHEL).

The non-profit organization that oversees AlmaLinux said four new entrants had arrived, with AMD, BlackHOST, and KnownHost joining at the Silver Member level, and Sine Nomine Associates joining the Gold tier.

The foundation expects the contributions from these new members to help bring AlmaLinux closer to full parity with RHEL.

AlmaLinux was started up last year in response to Red Hat's decision to effectively kill off the CentOS project it had operated up to that point and replace it with CentOS Stream, a kind of preview of what to expect in RHEL rather than a binary-compatible build.

"We founded the AlmaLinux OS Foundation for the specific goal of creating a CentOS successor that allowed those who had a stake in the future of the operating system to also have a voice," chair Benny Vasquez said in a statement welcoming the new members.

AMD said it is joining the AlmaLinux OS Foundation in order to sustain support for its products. The firm has seen growing adoption in the enterprise and high-performance computing (HPC) space for its Epyc processors, and so has an interest in ensuring that production-grade enterprise Linux distributions run smoothly on its processors.

Sine Nomine Associates is a custom engineering and development firm that provides consulting and support services to universities, plus government, banking, and finance sectors. The firm claims to have pioneered the virtual server farm concept using Linux on IBM mainframe systems.

KnownHost is a managed web hosting provider, and therefore has a stake in a free and open-source enterprise-grade Linux, as its COO Daniel Pearson explained in a statement.

"Web hosting runs on Linux and AlmaLinux provides a clean CentOS migration path and strong community engagement. By joining the AlmaLinux OS Foundation, we will continue to provide the best web hosting technology solutions for our customers," Pearson said.

BlackHOST offers a range of IT services for businesses and enthusiasts, but claims to specialize in unmetered networking hardware ranging from 1Gbps up to 100Gbps, utilizing datacenters and network points of presence around the world.

BlackHOST CTO Thomas Nuchatel said it has made AlmaLinux its default choice for clients seeking virtual private servers and dedicated servers.

"Linux is a key technology in web hosting and a range of other cloud infrastructure services, and AlmaLinux is the type of community-based distribution that provides value to our customers," he said.

The foundation said that AlmaLinux recently passed 1 million Docker pulls, and that there is now a beta release of AlmaLinux 8.5 for PowerPC. The foundation also has its first Platinum sponsor, Codenotary, and claimed to have released AlmaLinux 8.5 within 48 hours of the corresponding Red Hat Enterprise Linux release.
