Archive for the ‘Quantum Computer’ Category

Quantum Computing- The UK and Europe play catch-up with the USA and China. – Electropages

The 'my Quantum computer is bigger than yours' game has played out for many years, and the leading contenders in the Qubit superiority race are the USA and China.

Now Europe wants a seat at the big Quantum table, and there are EU consortiums and British-led partnerships aiming not only to develop a hyper-fast computer but, crucially, one that has many practical commercial applications.

So what are they up against? Well, the machine to beat at present is the Chinese computer called Jiuzhang, which the Chinese claim is a mere 10 billion times faster than Google's current offering. China says this gives it Quantum supremacy, but then it would, because that's exactly the term used by Google to describe its own Quantum offering.

Is there a difference between the Chinese machine and Google's? Yes, there is. Jiuzhang makes its calculations using optical circuits, whereas Google's Sycamore uses superconducting materials on a chip, a design that more closely resembles classical computers.

But in the technological chest-thumping world of Quantum computing, there is just one boast that everyone wants to make: 'mine's the fastest'.

In the need for speed, China's Jiuzhang computer is claimed to be 100 trillion times faster than supercomputers. This means that, in seconds, it can do what normal computers would take millions of years to achieve. These figures are impressive, but a word of caution: much depends on what test the Quantum computer was given to perform, as different tests can produce different computational speed results.

Nevertheless, the speed of true Quantum computing is mind-boggling, to say the least, and the real question is how these speeds are achieved. Qubits are how.

Normal computers calculate using bits, which have only two working states: 0 or 1. Quantum machines have bits (Qubits) that can occupy a superposition of many states simultaneously. This is what gives them a tremendous speed boost. Get a load of these Qubits into a synchronised linkage, and they can calculate in seconds what would take a conventional computer millions of years.
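The claim about simultaneous states can be made concrete with a small simulation. The sketch below is illustrative only, using NumPy as a stand-in for real Quantum hardware: it builds three Qubits, puts each into superposition with a Hadamard gate, and shows that all 2^3 = 8 classical states are represented in one statevector at once.

```python
import numpy as np

# A classical bit is 0 or 1; n Qubits are described by a statevector
# of 2**n complex amplitudes that evolve simultaneously.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
ket0 = np.array([1, 0], dtype=complex)         # the |0> state

n = 3
state = ket0
gate = H
for _ in range(n - 1):
    state = np.kron(state, ket0)   # build the |000> state
    gate = np.kron(gate, H)        # build H (x) H (x) H

state = gate @ state               # equal superposition of all 2**n states
probs = np.abs(state) ** 2
print(probs)                       # each of the 8 basis states has probability 1/8
```

Doubling the Qubit count squares the number of amplitudes tracked, which is the exponential scaling behind the headline speed claims.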

Qubits can be realised as atoms, ions, photons or electrons, and they give Quantum computers their inherent parallelism. This means that whereas a conventional computer works through calculations one at a time, a Quantum computer can work on millions simultaneously.

But it's not all about speed. Quantum computing falls short in three big areas: firstly, exactly what tests were run to achieve the claimed speed results; secondly, whether Quantum computers are reliable; and thirdly, what practical applications they can handle that would make them a commercially viable proposition.

The point about speed tests is that not all are created equal. Quantum computers have to be set up to perform a specific function. To test Jiuzhang, the computer had to calculate the output of a complex circuit that used light. It detected an average of 40 outputs, and it took a mere three minutes to do so, whereas one of the world's fastest supercomputers would have taken two billion years to reach the same conclusion. But this was a specially tailored test and didn't necessarily have relevance to broader applications in the commercial world.

Google's Sycamore testing also came under scrutiny from rival IBM, and again the discussion came down to how relevant the testing was in terms of real-world practicality.

Given these out-of-this-world performance figures, The Hitchhiker's Guide to the Galaxy's supercomputer Deep Thought looks pretty pedestrian. It took Deep Thought 7.5 million years to decide that the answer to the question of life, the universe and everything was 42.

Another operational shortfall with Quantum computing is reliability. By their very nature, Qubits are not durable and are easily upset; they need a perfect, temperature-controlled environment that is totally free of vibrations and stray particles. This, of course, can be created to keep the Qubits happy. Still, the length of time they will operate efficiently and accurately is minimal before they slow down and lose their Quantum coherence.

So while we are all astonished at examples of their computational speeds, Quantum computers are not anywhere near becoming a commercially viable proposition.

Enter the first European consortium with ambitions to change all that: the snappily titled German Quantum Computer based on Superconducting Qubits (GeQCoS) group. Munich chip-maker Infineon and scientists from five research institutes in Germany aim to drive forward the development and industrialisation of Quantum computing.

According to Infineon, Quantum computers have the potential to replace existing conventional computers in specific applications. They could, for example, calculate simulations of complex molecules for the chemical and pharmaceutical industry, complicated optimisations for the automotive and aviation industry, or new findings from the analysis of complex financial data.

The project is funded by the German Ministry of Education and Research and hopes to create a Quantum processor based on superconducting Qubits and demonstrate its special capabilities on a prototype within four years. Working together to achieve this are scientists at the Walther Meissner Institute of the Bavarian Academy of Sciences and Humanities and the Technical University of Munich, the Karlsruhe Institute of Technology, the Friedrich Alexander University of Erlangen-Nuremberg, Forschungszentrum Jülich, the Fraunhofer Institute for Applied Solid State Physics and Infineon.

"If we in Germany and Europe don't want to be dependent for this future technology solely on American or Asian know-how, we must move forward with the industrialisation now," explained Sebastian Luber, senior director of technology & innovation at Infineon.

Naturally, Germany is not alone in its bid to gain Quantum supremacy. The VTT Technical Research Centre of Finland is also part of a consortium seeking a Quantum technology lead.

It correctly believes superconducting processors could become a key ingredient in creating the next generation of supercomputers. Firstly, they could help tackle the major challenge of scaling up Quantum computers; secondly, they could speed up traditional supercomputers and drastically cut their power consumption.

A multidisciplinary research project led by VTT will tackle one of the main technical challenges to achieving this: transferring data to and from the low temperatures required for superconductivity.

The VTT consortium consists of Tampere University in Finland, KTH Royal Institute of Technology in Sweden, ETH Zürich in Switzerland and PTB, the national metrology institute of Germany, plus corporate partners Single Quantum in the Netherlands and Polariton Technologies in Switzerland. It is a three-year project.

We know that a Quantum computer's processing power is based on superconducting Qubits operating at extremely low temperatures, and Qubits are typically controlled by conventional electronics at room temperature, connected through electrical cables. However, when the number of Qubits eventually rises to the required level of hundreds of thousands, the matching number of control cables will generate an extreme heat load that considerably inhibits the Quantum processor's speed.

One solution is to control the Quantum processor with a nearby classical processor. A promising candidate is single flux quantum (SFQ) technology, which emulates the logic of traditional computers but uses superconducting technology instead of conventional semiconductors. Because it requires low operational temperatures, SFQ has rarely been used in traditional computers. This disadvantage, however, turns into an advantage when used in combination with superconducting Quantum computers.

But a major challenge remains. Calculation instructions come to the SFQ processor from a conventional supercomputer, and calculation results must be sent back from the SFQ processor to the same machine. This requires data transfer between extremely low temperatures and room temperature, which doesn't suit conventional semiconductors.

The VTT project's vision is to replace electrical cables with optical fibres and suitable converters that turn optical signals into electrical signals and vice versa. Unlike existing solutions, these components must be able to operate at low temperatures. This will require the development of innovative converters that can drive and read out a simple SFQ processor.

Besides Quantum computers, conventional supercomputers could benefit from the development of optical connections for SFQ technology. A major limitation of supercomputers is the extremely high power consumption of CPUs and GPUs due to the silicon chips' energy dissipation. Replacing silicon chips with superconducting SFQ chips in GPUs could have a notable impact on supercomputers' performance and power consumption.

Here in the United Kingdom, Oxford Instruments NanoScience has announced a significant innovation in its Cryofree dilution refrigerator technology. It believes the advancement of its ProteoxLX dilution refrigerator will take research into Quantum computing to the next level, enabling its commercialisation globally.

Since the launch of Proteox at APS Physics last year, Oxford Instruments has announced partnerships with the University of Glasgow, Rigetti and Oxford Quantum Circuits. Oxford Instruments NanoScience has also secured significant wins outside of Europe, most recently with Proteox being selected by SpinQ Technology in China.

"NanoScience is committed to driving leadership and innovation to support the development and commercialisation of Quantum computing around the world," explained Stuart Woods, managing director of Oxford Instruments NanoScience.

The ProteoxLX can maximise Qubit counts with a large sample space and ample coaxial wiring capacity, low-vibration features for reduced noise and support of long Qubit coherence times, and full integration of signal-conditioning components.

The LX also provides two fully customisable secondary inserts for an optimised layout of cold electronics and high-capacity input and output lines, fully compatible and interchangeable across the Proteox family. Finally, the ProteoxLX offers 25 µW of cooling power at 20 mK, a low base temperature below 7 mK, and twin pulse tubes providing up to 4.0 W of cooling power at 4 K.

All these UK and EU corporate and academic consortium-driven projects to advance Quantum computing should give US and Chinese technologists some challenges as to who stays ahead in the race to develop a commercially viable machine. Still, I don't expect either the US or China to rest on their Qubit laurels.

More here:
Quantum Computing- The UK and Europe play catch-up with the USA and China. - Electropages

Global Artificial Intelligence in Military Market (2020 to 2025) – Incorporation of Quantum Computing in AI Presents Opportunities -…

DUBLIN--(BUSINESS WIRE)--The "Artificial Intelligence in Military Market by Offering (Software, Hardware, Services), Technology (Machine Learning, Computer vision), Application, Installation Type, Platform, Region - Global Forecast to 2025" report has been added to ResearchAndMarkets.com's offering.

The Artificial Intelligence in military market is estimated at USD 6.3 billion in 2020 and is projected to reach USD 11.6 billion by 2025, at a CAGR of 13.1% during the forecast period.
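As a quick sanity check, the quoted figures are mutually consistent: growing from USD 6.3 billion in 2020 to USD 11.6 billion in 2025 implies a compound annual growth rate of about 13%, in line with the report's 13.1%. A two-line verification:

```python
# Verify the report's implied compound annual growth rate (CAGR):
# USD 6.3 bn (2020) -> USD 11.6 bn (2025) over a 5-year window.
start, end, years = 6.3, 11.6, 5
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # ~13%, matching the quoted 13.1% to rounding
```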

The Artificial Intelligence in Military market includes major players such as BAE Systems Plc (UK), Northrop Grumman Corporation (US), Raytheon Technologies Corporation (US), Lockheed Martin Corporation (US), Thales Group (France), L3Harris Technologies, Inc. (US), Rafael Advanced Defense Systems (Israel), and IBM (US), among others. These players have spread their business across various regions, including North America, Europe, Asia Pacific, the Middle East & Africa, and Latin America. COVID-19 has affected the AI in military market's growth to some extent, and this varies from country to country. Industry experts believe that the pandemic has not affected demand for Artificial Intelligence in defense applications.

Based on platform, the space segment of the Artificial Intelligence in military market is projected to grow at the highest CAGR during the forecast period

Based on platform, the space segment of the Artificial Intelligence in military market is projected to grow at the highest CAGR during the forecast period. The space AI segment comprises CubeSat and satellites. Artificial intelligence systems for space platforms include various satellite subsystems that form the backbone of different communication systems. The integration of AI with space platforms facilitates effective communication between spacecraft and ground stations.

Software segment of the Artificial Intelligence in Military market by offering is projected to witness the highest CAGR during the forecast period

Based on offering, the Software segment is projected to witness the highest CAGR during the forecast period. Technological advances in the field of AI have resulted in the development of advanced AI software and related software development kits. AI software incorporated in computer systems is responsible for carrying out complex operations. It synthesizes the data received from hardware systems and processes it in an AI system to generate an intelligent response. The software segment is projected to witness the highest CAGR owing to the significance of AI software in strengthening the IT framework to prevent security breaches.

The North American market is projected to contribute the largest share from 2020 to 2025 in the Artificial Intelligence in Military market

The US and Canada are key countries considered for market analysis in the North American region. This region is expected to lead the market from 2020 to 2025, owing to increased investments in AI technologies by countries in this region. This market is led by the US, which is increasingly investing in AI systems to maintain its combat superiority and overcome the risk of potential threats on computer networks. The US plans to increase its spending on AI in the military to gain a competitive edge over other countries.

The US is recognized as one of the key manufacturers, exporters, and users of AI systems worldwide and is known to have the strongest AI capabilities. Key manufacturers of AI systems in the US include Lockheed Martin, Northrop Grumman, L3Harris Technologies, Inc., and Raytheon. The new defense strategy of the US indicates an increase in AI spending to add advanced capabilities to existing defense systems of the US Army to counter incoming threats.

Market Dynamics

Drivers

Restraints

Opportunities

Challenges

Companies Mentioned

For more information about this report visit https://www.researchandmarkets.com/r/acjap9

Read more from the original source:
Global Artificial Intelligence in Military Market (2020 to 2025) - Incorporation of Quantum Computing in AI Presents Opportunities -...

The Quantum Age Will Require a Quantum Generation – Fair Observer

Bartlomiej K. Wroblewski / Shutterstock

Past the glow of the Shanghai evening, a single red beam threads its way into the silent stratosphere. It is a laser originating from a laboratory whose machinery few can operate or explain. The laser is meant to bounce off a distant satellite before returning, for the purpose of encrypting an otherwise earthly conversation in a manner as secure as it was once thought impossible.

China's pursuit of quantum technologies, quantum supremacy and a leadership stake in the much-heralded quantum future awaiting us is as well documented as the United States' similar quest. Additionally, opinions concerning quantum computing's significance to global security, business and geopolitics range from comprehensive analyses by the industry's experts to the musings of its pluckiest amateurs. Just as bountiful are the resources, both formally structured and open-sourced, available to anyone interested in the technical functionality, universal physical properties, revolutionary new bits or pioneering logic gates powering such complex, world-changing machinery.


So, rather than using this space for another overloaded elucidation of quantum computing's principles, our focus must pivot to the need for, and the already encouraging progress toward, educating the next generation of computer scientists, developers and engineers in what any of these words and concepts mean, what quantum computing is and, just as importantly, what it can be.

What quantum computing can be is the most significant technological, economic and governmental functionality in human history. It will empower its masters to blow away the capabilities available via traditional computers, solving problems in seconds that would take today's machines years to complete. What it will also be for some time is expensive, uncertain and a bit scary. But this is all the more evidence of our need to understand its most fruitful applications as well as its limitations, whatever those might be.

What quantum will be is what the next generation of students, the most technologically skilled cohort ever assembled, refers to not as quantum, but simply as computing.

While the hype over quantum computing's transformational capabilities across sectors, industries and regions has been building for decades, too many American public policy proposals in the quantum realm have begun and ended with public investment in hardware, sparing little attention or resources for the education of the next generation of engineers who, for any national quantum program or policy to succeed, must be equipped to use it. Some encouraging developments, however, indicate that the importance of quantum education and a quantum-skilled workforce may finally be taking root. Several entities are leading the charge to identify quantum education as a critical need, as collections of the right leaders in the right rooms (virtual or otherwise) conduct the first wave of conversations necessary to educate the workforce the quantum age will require.

The first such institution is the US Army. Placing a renewed emphasis on the development of its people, and the attraction of top industry talent to roles of public service, the US Army has led the way from a federal standpoint in committing to the modernization of its workforce for the quantum age. Though encouraging, it is important to note that such an undertaking must be generational in its scope and investment to be successful, as the biggest organizations, like the biggest ships, change direction most slowly.

The national security implications of quantum computers are a likely driving force behind the Army's design of its Quantum Leap initiative. With that said, it is encouraging that, given those concerns, the Army has responded with a people-first focus on developing, attracting and retaining the kind of talent necessary to steward the weaponry of the future capably and responsibly.

A layer beneath the modernization of federal agencies sits the collaborative approach of the US National Science Foundation, the White House Office of Science and Technology Policy, and a smattering of the country's largest technology firms, referred to as the National Q-12 Education Partnership. The appeal of such a public-private endeavor is clear and mutual, as both America's public and private entities have a stake in seeing the next generation of quantum leaders developed in the US.

Such partnerships should set ambitious goals for themselves and inclusively embrace the full breadth of talent waiting for them within a generation that is as unprecedentedly tech-savvy as it is diverse. Quantum must be more than yet another driver of inequality. Its transformational potential is too great to hoard in Palo Alto or Cambridge. As such, partnerships like these must emphasize the inclusion of institutions like America's historically black colleges and universities, which have done tremendous work to close achievement gaps in STEM fields, to tap their talents as indispensable leaders of this historic educational effort.

Finally, local initiatives to educate the next generation of quantum engineers mark perhaps the most American solution of all to this challenge. University leaders should take a lesson from their counterparts at UC Santa Barbara, who are partnering with local school districts to tailor quantum educational programming to students of all ages and ability levels. While federal support for such programming is surely welcome, universities and K-12 institutions need not wait for Washington to start training and identifying the future leaders of a quantum age approaching as fast as the photons flying over Shanghai.

Quantum computers, like traditional computers, televisions, toasters, phones and radios, will be neither good nor bad. But they will be here, available for common personal and business use, soon. Education in their design, functionality and best uses will allow for the formulation of informed, forward-looking, strategic quantum-computing governance policies rooted outside of the binary choice between ignorant cynicism and naive optimism. Rooted, that is, in the messy nuances of reality and the goal of every stubborn innovator: not just to build the thing right, but to build the right thing.

The views expressed in this article are the author's own and do not necessarily reflect Fair Observer's editorial policy.

Here is the original post:
The Quantum Age Will Require a Quantum Generation - Fair Observer

What is cloud-based quantum computing and How does it work? – Medium

credit: cloud.report

Quantum computers really do represent the future generation of computing. Cloud-based quantum computing is harder to pull off than AI, so the ramp-up will be slower and the learning curve steeper; given the rather nebulous science behind it, a practical, working quantum computer remains a distant prospect for most. Bits are the fundamental computing units, but they can store only two values, 0 and 1. Developers use quantum computing to encode problems as qubits, which work out multiple combinations of variables at once rather than exploring each possibility separately. Deploying quantum circuits, and the support systems necessary for their operation, is an expensive and difficult process. Within the scope of this analysis, firms that already use these systems enable cloud-based quantum computing via the platforms they build.

Many startups and technology giants, including Microsoft, IBM, and Google, recognize the value of making progress in this field, as this is the next major step in technology and computing. Quantum computers are lightning-fast compared to a typical Windows 10 or macOS computer, which makes them even quicker than the most powerful supercomputers we have today. Once users are allowed to access quantum-physics-powered computers via the web, that is quantum computing in the cloud.

Rigetti Computing is a startup that has developed a quantum processor operating on 128 qubits. It recently announced a Quantum Cloud Service, built on its existing quantum-computing-in-the-cloud programming toolkit. This service brings traditional and quantum computers together on one cloud platform to help users build applications using the power of qubit technology.

Bill Gates: "It isn't clear when it will work or become mainstream. There is a chance that within 6-10 years cloud computing will offer super-computation by using quantum. It could help us solve some very important science problems, including materials and catalyst design."

It will make a difference in several areas, with improvements in implementation and error correction. This new technology will reach a useful point through the participation and collaboration of many more people. Cloud-based quantum computing offers a direct interface to quantum circuits and quantum chips, enabling final testing of quantum algorithms, and provides a way for people to make improvements in quantum computing. Businesses and other domains can experiment by using QC on the cloud and don't have to wait for quantum computing technology to become mature and widespread.

See original here:
What is cloud-based quantum computing and How does it work? - Medium

Will Quantum Computers Break Bitcoin and the Internet? Heres the Outlook From Quantum Physicist Anastasia Marchenkova – The Daily Hodl

A quantum physicist says that while quantum computers pose no risk to Bitcoin mining, they threaten the algorithms that keep Bitcoin and the internet secure.

In a recent video, Anastasia Marchenkova argues that Bitcoin has a built-in design that protects it against entities using quantum algorithms to mine BTC at a rapid rate.

"Let's say one day we actually did discover a quantum algorithm that could solve this faster. Bitcoin is designed to adjust the difficulty if we mine blocks too fast. So even if we found this quantum algorithm, the difficulty would just get harder."
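The retargeting mechanism Marchenkova describes can be sketched in a few lines. This is a simplified illustration, not consensus code: real Bitcoin nodes retarget every 2016 blocks against a 10-minute block target and clamp any single adjustment to a factor of four.

```python
# Illustrative sketch of Bitcoin's difficulty retargeting (simplified).
TARGET_SPACING = 600       # target seconds per block (10 minutes)
RETARGET_BLOCKS = 2016     # blocks per retargeting window

def retarget(difficulty: float, actual_span_seconds: float) -> float:
    """Scale difficulty so the next window averages ~10 minutes per block."""
    expected = TARGET_SPACING * RETARGET_BLOCKS
    ratio = expected / actual_span_seconds
    ratio = max(0.25, min(4.0, ratio))   # clamp, as the consensus rules do
    return difficulty * ratio

# If quantum-accelerated miners produced a window 100x too fast,
# difficulty still only quadruples per window, but it keeps rising
# every window until block times return to ~10 minutes.
d = 1.0
fast_span = (TARGET_SPACING * RETARGET_BLOCKS) / 100
for _ in range(4):
    d = retarget(d, fast_span)
print(d)   # 256.0 after four clamped 4x adjustments
```

This is why a faster mining algorithm, quantum or otherwise, only speeds up the chain temporarily: the protocol absorbs the advantage by raising the difficulty.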

However, the quantum physicist warns that quantum computing poses a serious risk to cryptographic algorithms which keep cryptocurrencies and the internet at large secure.

"There are two common cryptosystems, RSA and elliptic curve encryption, and these are affected by quantum computers. When you're online, information that you send is encrypted, often with these two. Both of these are vulnerable to attacks by quantum computers, which means a large enough quantum computer will be a problem for anyone online."

"There actually is a quantum algorithm to break RSA and elliptic curve encryption. Bitcoin does use elliptic curve encryption (ECC) to generate the public key, which is created from the private key that authorizes transactions."

"That means that someone with a large enough and coherent enough quantum computer, with coherence meaning the length of time the quantum information can be stored, can actually get your private key from your public key, and that's a very serious problem. That private key can then be used to authorize transactions that the owner doesn't want to happen. So as quantum computers become better and better, the security of RSA and elliptic curve is no longer effective."
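The private-to-public relationship she describes can be illustrated with a toy elliptic curve. This is NOT real cryptography: the curve below lives over the 17-element field rather than Bitcoin's secp256k1, and the point names are made up for illustration. The public key is the scalar multiple k*G of a base point G; recovering k from k*G (the discrete logarithm) is easy here but infeasible classically at real key sizes, and it is exactly what Shor's algorithm would make tractable.

```python
# Toy curve y^2 = x^3 + 7 over F_17, standing in for secp256k1.
P = 17
G = (1, 5)  # a point on the curve: 5^2 = 1^3 + 7 (mod 17)

def add(p1, p2):
    """Elliptic-curve point addition; None is the point at infinity."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                                   # inverse points
    if p1 == p2:
        lam = (3 * x1 * x1) * pow(2 * y1, -1, P) % P  # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P     # chord slope
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def pubkey(k):
    """Double-and-add scalar multiplication: public key = k * G."""
    result, addend = None, G
    while k:
        if k & 1:
            result = add(result, addend)
        addend = add(addend, addend)
        k >>= 1
    return result

private_key = 3
print(pubkey(private_key))   # (5, 9): fast to compute forward, but
                             # inverting it is the hard discrete-log problem
```

At Bitcoin's real scale the private key is a ~256-bit scalar, and no classical shortcut from public back to private key is known; Shor's algorithm removes that safety margin.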

Crypto sleuths continue to track the advancement of quantum machines. These have the capability to crack complex mathematical problems using quantum bits, or qubits, which can maintain a superposition by being in two states at the same time.

While the future of cryptocurrencies may be threatened, Marchenkova says digital assets can adopt developments that can effectively resist quantum-based attacks.

"So we'll need to pick an algorithm that can actually stand up to quantum attacks. We call this post-quantum cryptography: classical algorithms, not based on quantum principles, that can stand up to quantum computing attacks. One of the current leading candidates is lattice-based cryptography."

"Another approach is using symmetric cryptography like AES (advanced encryption standard), which is weakened by quantum computers but not broken in the way that RSA and elliptic curve are."

"There are also other coins already using hash-based cryptography. And so far, like I mentioned, hash-based cryptosystems actually resist quantum computing attacks. We don't know if that's going to hold true forever, but so far that seems to be the case."
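The "weakened but not broken" verdict for symmetric ciphers such as AES traces back to Grover's algorithm, which offers only a quadratic speed-up on brute-force key search; a key's effective strength is roughly halved in bits rather than collapsing outright, as RSA and ECC do under Shor. A back-of-the-envelope sketch:

```python
# Grover's search needs on the order of sqrt(2**k) evaluations to find
# a k-bit key, so effective key strength is roughly halved in bits.
def grover_effective_bits(key_bits: int) -> int:
    return key_bits // 2

for k in (128, 192, 256):
    print(f"AES-{k}: ~2^{grover_effective_bits(k)} quantum evaluations")
# AES-256 still leaves ~2^128 work for an attacker, which is why simply
# doubling key sizes is the standard symmetric-crypto response.
```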


Featured Image: Shutterstock/GrandeDuc

Read the original:
Will Quantum Computers Break Bitcoin and the Internet? Heres the Outlook From Quantum Physicist Anastasia Marchenkova - The Daily Hodl