Archive for the ‘Quantum Computer’ Category

This breakthrough could unlock the true power of quantum – Wired.co.uk

There are three kinds of light, says Carmen Palacios-Berraquero, the CEO and co-founder of Nu Quantum, a quantum photonics company based in Cambridge. Chaotic light is the stuff we encounter on a daily basis: street lamps and light bulbs. Coherent light covers things with structure, like lasers, which were first built in 1960 and have had a revolutionary impact on everything from surgery to home entertainment.

Palacios-Berraquero hopes that the third category, single-photon sources, could have an equally transformative effect. At Nu Quantum, she is working on technologies that can emit and detect single photons, the smallest possible units of light. "Photonic quantum technologies are about manipulating information: processing, communicating and securing information encoded in single particles of light," she says. "That allows you to do different things: more powerful calculations, or better security."

Single photons can't be eavesdropped on or tampered with without the sender and recipient finding out. And they can be used to take advantage of quantum properties such as entanglement to enable more powerful computing and cryptography.

But building them is a really difficult technical challenge. There are only a handful of companies around the world, no more than six, says Palacios-Berraquero, that can reliably and controllably either emit or detect single photons. Nu Quantum is hoping to do both.

The company was spun out of research at Cambridge University's Cavendish Laboratory. Palacios-Berraquero had studied physics as an undergraduate and been drawn to the beauty of the interactions between light and matter. During her PhD, she developed a new technique for producing single-photon emitters and adapted it to work on ultra-thin crystals of hexagonal boron nitride: a tiny defect in the crystal traps an electron, which then gives off photons.

She began the process of patenting it and, feeling disillusioned with academia, started exploring potential commercialisation opportunities for her single-photon emitters. At around the same time, she was introduced to Matthew Applegate, another Cavendish researcher who had developed a way of detecting single photons. "What was already a solid business idea with some investment became a portfolio approach, in which I had invented a single-photon source, and Matthew had invented a single-photon detector," she says.

Nu Quantum has won £3.6m in government grants, and has just started working with BT, Airbus and other partners to test potential uses for its components. In September 2020 it closed a £2.1m seed round, which will help fuel rapid growth and a move into a state-of-the-art photonics lab in Cambridge.

The first product, set for launch in 2022, will be a quantum random number generator, which will take advantage of the quantum nature of single photons to generate truly random numbers, based on an algorithm developed by Applegate, now Nu Quantum's CTO. There are potential applications in video games, gambling, cloud security and communication, where random numbers are used to generate the keys that scramble encrypted messages. The technology could also play a role in distributing those keys: Nu Quantum is working with BT on a pilot that will generate, emit and detect quantum keys and make telecoms more secure. "We are aspiring to be much more than the sum of the parts," says Palacios-Berraquero. "The aspiration is something much bigger."
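The single-photon principle behind such a device can be pictured with a short sketch. This is our own illustration, not Nu Quantum's or Applegate's algorithm: a single photon sent at a 50:50 beamsplitter is detected at exactly one of two outputs, yielding one genuinely random bit per photon. Here the operating system's entropy pool stands in for the detector hardware.

```python
import secrets

def photon_bit() -> int:
    """Placeholder for 'which detector clicked?': 0 or 1, equally likely."""
    return secrets.randbits(1)

def quantum_key(n_bits: int) -> int:
    """Assemble n_bits of detector outcomes into one integer key."""
    key = 0
    for _ in range(n_bits):
        key = (key << 1) | photon_bit()
    return key

key = quantum_key(128)  # e.g. a 128-bit symmetric encryption key
```

The point of the hardware is the source of the bits: a detector click is random by the laws of physics, whereas a software generator is only as unpredictable as its seed.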

Amit Katwala is WIRED's culture editor. He tweets from @amitkatwala


Read the original:
This breakthrough could unlock the true power of quantum - Wired.co.uk

The Bulletin announces its 2020 Leonard M. Rieser Award – Bulletin of the Atomic Scientists

Quantum computing will have an impact on national security, just not in the way that some of the policy community claims that it will.

The Bulletin of the Atomic Scientists has named Jake Tibbetts as its 2020 Leonard M. Rieser Award recipient for his February 11 essay "Keeping classified information secret in a world of quantum computing." The article was selected by the Bulletin's editorial team from its Voices of Tomorrow column, which promotes rising experts who write with distinction on topics including nuclear risk, climate change, and disruptive technologies.

Tibbetts is a master's student at the University of California, Berkeley, where he is studying electrical engineering and computer science and researching the application of machine learning to nuclear safeguards. He is a fellow at the Nuclear Science and Security Consortium and a former research associate at the Center for Global Security Research at Lawrence Livermore National Laboratory.

Tibbetts was also involved in the creation of SIGNAL, an online three-player experimental wargame in which three countries, some armed with nuclear weapons, attempt to achieve national goals through diplomacy and conflict. SIGNAL is designed to increase understanding of the impact of emerging technologies on strategic stability and nuclear risk reduction. Tibbetts is interested in cybersecurity and national security from both a technical and a policy perspective.

"In his piece, Jake Tibbetts accomplished the kind of deep, thoughtful, and well-crafted journalism that is the Bulletin's hallmark," editor-in-chief John Mecklin said. "Quantum computing is a complex field; many articles about it are full of strange exaggerations and tangled prose. Tibbetts' piece, on the other hand, is an exemplar of clarity and precision and genuinely worthy of the Rieser Award."

The Rieser Award is the capstone of the Bulletin's Next Generation Program, created to ensure that new voices, steeped in science and public policy, have a trusted platform from which to address existential challenges. It is named for physicist Leonard M. Rieser (1922-1998), board chair at the Bulletin from 1984 to 1998.

"The Leonard Rieser Award is designed to inspire thought-provoking scientific essays that can contribute to advances in public policy," said Tim Rieser, who, along with his brother Len and sister Abby, helped establish the Rieser Award in their father's honor. "Jake Tibbetts, this year's awardee, has done us all a service by tackling quantum computing and the so-called race for quantum supremacy. The hype surrounding that race, he argues, may be obscuring a more serious issue: the need to protect existing encrypted information against future decryption techniques. As someone who has had access to encrypted information, I congratulate Mr. Tibbetts and the Bulletin for highlighting a subject that has serious implications for us all and deserves greater attention."

The Rieser Award includes a $1,000 cash prize and a one-year subscription to the Bulletin's online magazine. The Rieser Award recipient is also invited to offer remarks at the Bulletin's annual dinner in November. More about the award, Leonard M. Rieser, previous recipients, and all Voices of Tomorrow authors can be found here.

To support the Bulletin's Next Generation programs, visit our gift page.

More here:
The Bulletin announces its 2020 Leonard M. Rieser Award - Bulletin of the Atomic Scientists

University of Texas at San Antonio and Port of San Antonio partner to boost supply chain security and data innovation – Security Magazine

University of Texas at San Antonio and Port of San Antonio partner to boost supply chain security and data innovation | 2020-12-16 | Security Magazine

More:
University of Texas at San Antonio and Port of San Antonio partner to boost supply chain security and data innovation - Security Magazine

VIEW: 5 technology trends that will disrupt the future – CNBCTV18

No one could have predicted the impact that technology has had on our lives this year. Amidst a global pandemic, our health, economy, businesses and livelihoods have been upended. At the same time, the rate at which companies have had to adapt to survive has meant many have adopted disruptive technologies at a more rapid rate than we could have imagined.

These new technology trends are set to transform businesses. While no one can truly predict what the future holds, there are benefits to be gained from these five technological innovations for a competitive advantage.

All-photonics networks (APN) will power next-generation communication

It is no secret that power and energy consumption from IT systems has had a large and detrimental effect on the environment. However, the introduction of all-photonics networks (APNs) can significantly reduce this impact.

APNs use optical and hybrid cabling for end-to-end information transmission between terminals and servers. This allows for the transfer of large volumes of traffic while keeping latency low. The technique uses one-hundredth of the power consumption required by today's networks.

As well as clear environmental benefits, these networks are intuitive, allowing people to connect from any location or environment. In time, experts expect that transmission capacity could increase to the extent that you could download 10,000 2-hour movies in a fraction of a second. The result is a next-generation communications platform that represents a major leap forward towards a smart, sustainable and energy-efficient business.

Cognitive Foundation technology will connect and control everything

Cognitive Foundation (CF) technology links virtualised ICT resources and integrates them with diverse systems and networks to create a robust information-processing platform. CF can analyse and forecast data without being constrained by the format or systems in which data resides.

This allows businesses to orchestrate information from various interfaces, from voice and video to sensor data from the Internet of Things. CF provides a centralised place for IT to manage all of its ICT resources, forming the foundation for innovative projects like smart cities.

In fact, CF is used by the City of Las Vegas in a ground-breaking project that combines various data points to predict and prevent incidents. The City uses orchestration capabilities, based on virtualisation software, to analyse video, voice, and sensor information automatically. The City is now looking at how to evolve the system into a fully automated and autonomous operation that can not only analyse automatically, but also think and act on its own.

Digital twin computing (DTC) integrates the real and virtual worlds to predict the future

Digital twins are not new. They are virtual representations of real-world environments, products or assets used to test or simulate the impact of new and different environments. For example, digital twins are used by manufacturers to manage the performance and effectiveness of new machines or plants and by city planners to simulate the impact of new developments and roads.

Digital twins can be used to simulate environments and also assist in designing solutions themselves. By freely copying, combining and exchanging various digital twins of things and people, information is integrated into applications such as traffic congestion prediction systems. Digital twins could even go as far as to make accurate predictions in the field of disease control.

A person could even have their own digital twin. The twin could perform certain routine tasks in cyberspace, in place of the actual person. The twin could even make decisions online. The technology could integrate peoples minds, thinking, habits and attitudes into their digital twin.

Of course, there is the matter of ethics and social responsibility when it comes to such innovations. But, as the application of digital twins and regulation continues to evolve, the impact for businesses and productivity is clear.

The rise of the citizen developer: How robotic processes automation will reshape business

With tech giants including Google, Amazon and Facebook offering AI-as-a-service and data-as-a-service, we are seeing the birth of the citizen developer. These companies offer tools that range from robotic process automation (RPA) to graphics processing units in the cloud. The move means anyone can create business applications using company data with little to no coding skills.

This is set to be a game-changer for many businesses, which could build simple process applications, with very little oversight, to automate certain tasks and processes. This will free up time for employees to focus on higher-value work.

Business users are often better subject matter experts as well as being closer to the challenges with an understanding of the best ways to solve them. By putting them in the driving seat, organisations will be able to accelerate digital transformation.

RPA has the potential to transform the future of work. But, as new complexities are added, companies will need to establish the correct data strategy with flexible intelligent infrastructure and open systems to make this innovation accessible, but also safe, for all parties.

Quantum and edge computing ushers in a new era

The rise of powerful computing capability that puts more processing at or near the source of data is already starting to transform companies of all sizes. Two computing paradigms, quantum computing and edge computing, are at the forefront of innovation.

Quantum computers solve problems that are too difficult for a traditional computer, drawing on extra computational power. Whereas a traditional computer processes information as 1s and 0s, in the quantum world those bits, known as qubits, can exist in both states simultaneously, allowing computations to be performed in parallel. Quantum computers require special algorithms that are capable of performing tasks we would never imagine possible, making them more powerful than anything built to date.
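One way to see the scale of that parallelism: a classical simulation must track one complex amplitude per basis state, so n qubits in superposition already require 2**n numbers. A minimal statevector sketch (our own illustration, not from the article):

```python
import numpy as np

# Hadamard gate: sends |0> to an equal superposition of |0> and |1>.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
ket0 = np.array([1.0, 0.0])

def uniform_superposition(n: int) -> np.ndarray:
    """Apply H to each of n qubits, starting from |00...0>."""
    plus = H @ ket0              # (|0> + |1>) / sqrt(2)
    state = np.array([1.0])
    for _ in range(n):
        state = np.kron(state, plus)  # tensor one more qubit onto the register
    return state

state = uniform_superposition(10)
# 10 qubits hold 2**10 = 1024 amplitudes at once; each further qubit
# doubles the amount of state a classical machine must track.
```

This doubling is exactly why simulating even a few dozen ideal qubits classically becomes intractable.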

Edge computing, on the other hand, focuses on processing information closer to the source, for increased speed. Today, most computing takes place in the cloud with the potential for latency. Edge computing requires custom chips and hardware but works alongside the cloud to leverage its benefits without latency. For example, edge computing would allow an autonomous cars computer vision system to process and recognise images immediately, rather than sending the data to the cloud for verification.

With as many as 50 billion devices online in the future, all generating data, edge computing will be needed to deliver the Internet of Things and 5G. It will enable near real-time applications and artificial intelligence (AI) at the edge. As virtual reality (VR) becomes more popular, and more processes happen in headsets, edge computing will play a vital role in delivering a good VR experience.

Technology undoubtedly has the power to transform. Despite the challenges the world faces, businesses have an opportunity to accelerate change by tapping into the very latest innovations. From reducing power consumption to innovative ways of moving and analysing data, these five technology trends are set to help companies carve out a much needed competitive edge as we approach 2021 and beyond. Now, more than ever, it is a time to reflect, learn and refocus on crafting a future centred on improving our wellbeing and a more sustainable environment.

See the original post:
VIEW: 5 technology trends that will disrupt the future - CNBCTV18

Imperfections Lower the Simulation Cost of Quantum Computers – Physics

November 23, 2020 • Physics 13, 183

Classical computers can efficiently simulate the behavior of quantum computers if the quantum computer is imperfect enough.

With a few quantum bits, an ideal quantum computer can process vast amounts of information in a coordinated way, making it significantly more powerful than a classical counterpart. This predicted power increase will be great for users but is bad for physicists trying to simulate on a classical computer how an ideal quantum computer will behave. Now, a trio of researchers has shown that they can substantially reduce the resources needed to do these simulations if the quantum computer is imperfect [1]. The arXiv version of the trio's paper is one of the most "Scited" papers of 2020, and the result generated quite a stir when it first appeared back in February. I overheard it being enthusiastically discussed at the Quantum Optics Conference in Obergurgl, Austria, at the end of that month, back when we could still attend conferences in person.

In 2019, Google claimed to have achieved the quantum computing milestone known as quantum advantage, publishing results showing that their quantum computer Sycamore had performed a calculation that was essentially impossible for a classical one [2]. More specifically, Google claimed that they had completed a three-minute quantum computation, which involved generating random numbers with Sycamore's 53 qubits, that would take thousands of years on a state-of-the-art classical supercomputer, such as IBM's Summit. IBM quickly countered the claim, arguing that more efficient memory storage would reduce the task time on a classical computer to a couple of days [3]. The claims and counterclaims sparked an industry clash and an intense debate among supporters in the two camps.

Resolving the disparity between these estimates is one of the goals of the new work by Yiqing Zhou, of the University of Illinois at Urbana-Champaign, and her two colleagues [1]. In their study, they focused on algorithms for classically replicating imperfect quantum computers, which are also known as NISQ (noisy intermediate-scale quantum) devices [4]. Today's state-of-the-art quantum computers, including Sycamore, are NISQ devices. The algorithms the team used are based on so-called tensor network methods, specifically matrix product states (MPS), which are good for simulating noise and so are naturally suited to studying NISQ devices. MPS methods approximate low-entangled quantum states with simpler structures, so they provide a data-compression-like protocol that can make it less computationally expensive to classically simulate imperfect quantum computers (see Viewpoint: Pushing Tensor Networks to the Limit).

Zhou and colleagues first consider a random 1D quantum circuit made of neighboring, interleaved two-qubit gates and single-qubit random unitary operations. The two-qubit gates are either controlled-NOT gates or controlled-Z (CZ) gates, which create entanglement. They ran their algorithm for NISQ circuits containing different numbers of qubits, N, and different depths, D, a parameter that relates to the number of gates the circuit executes (Fig. 1). They also varied a parameter χ in the MPS algorithm. χ is the so-called bond dimension of the MPS and essentially controls how well the MPS captures entanglement between qubits.

The trio demonstrates that they can exactly simulate any imperfect quantum circuit if D and N are small enough and χ is set to a value within reach of a classical computer. They can do that because shallow quantum circuits can only create a small amount of entanglement, which is fully captured by a moderate χ. However, as D increases, the team finds that χ cannot capture all the entanglement. That means that they cannot exactly simulate the system, and errors start to accumulate. The team describes this mismatch between the quantum circuit and their classical simulations using a parameter that they call the two-qubit gate fidelity f_n. They find that the fidelity of their simulations slowly drops, bottoming out at an asymptotic value as D increases. This qualitative behavior persists for different values of N and χ. Also, while their algorithm does not explicitly account for all the error and decoherence mechanisms in real quantum computers, they show that it does produce quantum states of the same quality ("perfection") as the experimental ones.
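The bond-dimension idea can be sketched in a few lines. This is a hedged illustration of the generic MPS truncation step, not the trio's actual code: cut the register across a bond, keep only the χ largest singular values, and measure how much fidelity the compression costs.

```python
import numpy as np

def truncated_fidelity(state: np.ndarray, n_left: int, chi: int) -> float:
    """Cut the register after the first n_left qubits, keep chi singular
    values across the cut, and return the fidelity of the compressed state."""
    m = state.reshape(2**n_left, -1)
    u, s, vh = np.linalg.svd(m, full_matrices=False)
    s[chi:] = 0.0                      # discard all but the chi largest
    approx = (u * s) @ vh              # rebuild the truncated state
    approx /= np.linalg.norm(approx)   # renormalise
    return float(abs(np.vdot(state.ravel(), approx.ravel())) ** 2)

product = np.kron([1.0, 0.0], [1.0, 0.0])            # unentangled |00>
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # maximally entangled

print(truncated_fidelity(product, 1, 1))  # ~1.0: chi=1 already exact
print(truncated_fidelity(bell, 1, 1))     # ~0.5: entanglement is lost
print(truncated_fidelity(bell, 1, 2))     # ~1.0: chi=2 captures it fully
```

A product state needs only χ=1, while a maximally entangled pair needs χ=2; in deep random circuits the required χ grows quickly, which is exactly where the classical simulation starts paying fidelity for tractability.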

In light of Google's quantum advantage claims, Zhou and colleagues also apply their algorithm to 2D quantum systems; Sycamore is built on a 2D chip. MPS are specifically designed for use in 1D systems, but the team uses well-known techniques to extend their algorithm to small 2D ones. They use their algorithm to simulate an N=54, D=20 circuit, roughly matching the parameters of Sycamore (Sycamore has 54 qubits, but one is unusable because of a defect). They replace Google's more entangling iSWAP gates with less entangling CZ gates, which allows them to classically simulate the system up to the same fidelity as reported in Ref. [2] with a single laptop. The simulation cost should increase quadratically for iSWAP-gate circuits, and although the team proposes a method for performing such simulations, they have not yet carried them out because of the large computational cost it entails.

How do these results relate to the quantum advantage claims by Google? As they stand, they do not weaken or refute those claims: with just a few more qubits, and an increase in D or f, the next generation of NISQ devices will certainly be much harder to simulate. The results also indicate that the team's algorithm only works if the quantum computer is sufficiently imperfect; if it is almost perfect, their algorithm provides no speed-up advantage. Finally, the results provide numerical insight into the values of N, D, f, and χ for which random quantum circuits are confined to a tiny corner of the exponentially large Hilbert space. These values give insight into how to quantify the capabilities of a quantum computer to generate entanglement as a function of f, for example.

So, what's next? One natural question is: can the approach here be transferred to efficiently simulate other aspects of quantum computing, such as quantum error correction? The circuits the trio considered are essentially random, whereas quantum error correction circuits are more ordered by design [5]. That means that updates to the new algorithm are needed to study such systems. Despite this limitation, the future looks promising for the efficient simulation of imperfect quantum devices [6, 7].

Jordi Tura is an assistant professor at the Lorentz Institute of the University of Leiden, Netherlands. He also leads the institute's Applied Quantum Algorithms group. Tura obtained his B.Sc. degrees in mathematics and telecommunications and his M.Sc. in applied mathematics from the Polytechnic University of Catalonia, Spain. His Ph.D. was awarded by the Institute of Photonic Sciences, Spain. During his postdoctoral stay at the Max Planck Institute of Quantum Optics in Germany, Tura started working in the field of quantum information processing for near-term quantum devices.


See the article here:
Imperfections Lower the Simulation Cost of Quantum Computers - Physics