Archive for the ‘Quantum Computer’ Category

#YouthMatters: IBM’s Amira Abbas on quantum computing and AI – Bizcommunity.com

Amira Abbas, research scientist at IBM

Here, Abbas shares more about herself, her achievements, and what made her choose to focus on quantum computing.

Abbas: I feel extremely fortunate because I think I have a super cool role that combines everything I love doing. I'm currently a PhD student and my research is directly aligned to the research I do at IBM. In other words, researching for my PhD is my job.

Currently, I spend most of my time trying to figure out how quantum computers can help make artificial intelligence (AI) better. Quantum computers are often viewed as supercomputers that can outperform the computers we use today. But it's actually quite hard to figure out where quantum computers can help us, especially in AI.

I work with the IBM team in Zurich, Switzerland to try and understand this particular problem. I also work with the team in South Africa to teach more people in Africa about quantum computing. I love this balance of research and community work in my role because it requires very different skills and stimulates me in different ways.

Abbas: I grew up in a city called Durban on the east coast of South Africa. I always loved mathematics and used to get really excited as a kid when I saw crazy equations in movies. I would think to myself, I wish I could understand those things and do stuff like that. This curiosity and relish to understand mathematics led me to study actuarial science, which is notoriously heavy on mathematics and statistics.

I then went to work in asset management in Johannesburg for a few years. This was a great learning experience, but I couldn't shake the feeling that something was missing from my life.

Soon after this discovery, I left the financial industry and went back to study a master's in physics specialising in quantum computing. I am now doing my PhD in quantum machine learning and couldn't be happier.

Abbas: I think what excites me most about quantum computing is all the unknowns and things we still have to discover. As a researcher, it's a dream to work in a field with so many open questions: how can quantum help AI? How can quantum help Africa and Africa-specific problems? Are quantum techniques even helpful and beneficial to us?

Additionally, there are lots of low-hanging fruit because the field of quantum computing is relatively young and so lots of discoveries are inevitable.

The field itself is also so broad and has attracted a very interesting and diverse community. This makes quantum even more enjoyable - being in a space with cool people and getting to explore fascinating things.

Abbas: I would love to continue to produce high calibre research output in quantum computing.

I want to inspire others to see that it doesn't matter where you're from, what university you are at or what your background is: if you believe you can do something meaningful - even in a field as crazy-sounding as quantum computing - then you can. It just takes hard work and persistence. So, I just want to keep at it and progress my research career by producing interesting work in the field of quantum computing and AI.

Abbas: In terms of achievements, I think it's pretty cool that I'm the first African to have received Google's PhD Fellowship award for the category of quantum computing.

I have also placed first at global quantum computing hackathon events, such as the Qiskit Europe Hackathon in Zurich in 2019 and the Xanadu Quantum Hackathon in Toronto in 2019.

Recently, I was the lead author on a quantum machine learning paper that made the cover of a Nature Research journal.

Otherwise, I have also received multiple scholarship awards and invitations to speak at numerous quantum and women in science, technology, engineering and mathematics (STEM) events.

Abbas: My life in a nutshell: Coffee, research, reading, eating and somehow managing to sleep.

My family often say that I work a bit more than the average person, but when you're working on something you're passionate about, it never feels like work and it never feels like enough.

But on weekends, I try to get out into nature as much as possible. Living in South Africa, I am privileged to be able to experience such wonderful outdoor activities and I love hiking.

Abbas: I always say that science and technology are a lot more like art than people realise. It's crucial to grasp them for critical thinking, but you have to find what works for you. It's important as a young person to keep in mind that science and technology are extremely broad: just because you don't understand one thing doesn't mean you won't understand everything else.

It's also important for our youth to think about what the future holds for any country, industry or profession, and how advancements in science and technology will affect that.

Luckily, we live in a time where we have access to high-quality research and ideas through our phones. This is how I came across quantum computing, which, for example, has the potential to speed up computations used across finance, logistics, healthcare, and more.

We need to foster our skills locally so that our research can contribute to cutting-edge work and allow us to be ahead of the curve, instead of mere consumers of advanced tech/science.

Abbas: It's really easy to develop a mental 'block' against science and technology. Sometimes people become afraid of maths, for example, if they don't understand it in high school. This was similar to my experience with physics; in fact, physics was my lowest mark in school because I never really understood it. Now I'm doing a PhD in physics, which I would once have thought impossible. The key is to view science and technology as art and find your niche in this very broad space.

As for advice, I strongly believe that all it takes to achieve your goals is consistent hard work and a balanced lifestyle. If you're still figuring out what your passion is, or feeling as if something in your life is missing, keep upskilling yourself and try to read more about things you normally wouldn't. Maybe one day you will come across the thing that makes you tick, and then hard work can be pleasurable if you're working on something aligned to your passion.

See original here:
#YouthMatters: IBM's Amira Abbas on quantum computing and AI - Bizcommunity.com

Crédit Agricole CIB partners with Pasqal and Multiverse Computing – IBS Intelligence

Crédit Agricole CIB, together with European tech companies Pasqal and Multiverse Computing, announced a partnership to design and implement new approaches running on classical and quantum computers to outperform state-of-the-art algorithms for capital markets and risk management.

International companies and institutions have started investing heavily in quantum technologies. Europe launched the Quantum Flagship Plan in October 2018, and France recently announced a €1.8 billion investment plan.

Quantum computing is likely to profoundly impact multiple industries in the coming years, including finance. Finance has long made substantial use of algorithms requiring advanced mathematics and statistics; now it is the turn of quantum physics to help solve quantitative financial problems. Quantum theory and technology, combined in quantum computing, are starting to demonstrate promising applications in capital markets and risk management.

Crédit Agricole CIB has teamed up with two quantum technology companies to apply quantum computing to real-world finance applications. French company Pasqal is developing a quantum computer based on neutral-atom arrays, a technology currently being trialled for building industrial quantum computers. Spanish company Multiverse Computing specialises in quantum algorithms that can run on both quantum and classical computers.

Georges-Olivier Reymond, CEO of Pasqal, said: "I strongly believe in that partnership to foster the usage of quantum computing for finance. To our knowledge, it is the first ever in which all the stakeholders - software developer, hardware provider and end-user - are working together on a problem. All the teams are very excited, and this development will be the cornerstone of future industrial applications for neutral-atom quantum computers."

Enrique Lizaso, CEO of Multiverse Computing, said: "We are thrilled with the opportunity of working together with Crédit Agricole CIB and Pasqal in this ambitious project, which will put into production the most advanced tools, currently only used in large non-financial institutions in the US and China. This is a landmark project for finance worldwide."


See original here:
Crédit Agricole CIB partners with Pasqal and Multiverse Computing - IBS Intelligence

The evolution of cryptographic algorithms – Ericsson

Cryptographic algorithms and security protocols are among the main building blocks for constructing secure communication solutions in the cyber world. They correspond to the locks that secure a house in the physical world. In both, it is very difficult to access the assets inside without a valid key. The algorithms and protocols are based on mathematical problems that are computationally infeasible to solve, whereas the lock mechanisms are based on the difficulty of defeating the physical construction.

Mobile networks are critical infrastructure and make heavy use of advances in cryptographic algorithms and protocols to ensure the security of communications and the privacy of individuals. In this blog post, we take a detailed look at the cryptographic algorithms and protocols used in mobile communications and share some insights into recent progress. We give an overview taking into consideration the development from 2G to 5G and beyond. In addition, we present detailed information on the progress toward defining the profiles to be used in the security protocols for mobile communication systems. Last but not least, we give the current status and future plans for post-quantum cryptographic algorithms and protocols.

It can be hard to get an overview of the cryptographic algorithms used in mobile networks. The specifications are spread out over many documents, published over a period of 30 years by the three standardization organizations: 3GPP, ETSI and GSMA. The algorithms can also have quite cryptic names, with more than one name often given to the same algorithm. For example, GEA5, UEA2, 128-EEA1 and 128-NEA1 are almost identical specifications of SNOW 3G for GPRS, UMTS, LTE and NR respectively.

The 3GPP/GSMA algorithms come in three different types: authentication and key generation, encryption and integrity. The authentication and key generation algorithms are used in the Authentication and Key Agreement (AKA) protocol. The encryption and integrity algorithms are used together or independently to protect control plane and user plane data. An overview of all currently specified algorithms is shown in Figures 1 and 2.

The second generation (2G or GSM) mobile networks have quite low security by today's standards. But GSM was actually the first mass-market communication system to use cryptography, which was both revolutionary and controversial. At the time, export of cryptography was heavily restricted and GSM had to be designed with this in mind. The encryption algorithms A5/1 and A5/2 are LFSR-based stream ciphers supporting 64-bit key length. A5/2 is a so-called export cipher designed to offer only a 40-bit security level. Usage of export ciphers providing weak security was common at that time and other standards like TLS also supported export cipher suites.
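To make the LFSR-based stream cipher idea concrete, here is a toy keystream generator in Python. It is emphatically not A5/1 (which combines three irregularly clocked registers); the register size, tap positions and key value below are made up for illustration.

```python
# Toy linear feedback shift register (LFSR) keystream generator.
# NOT the real A5/1 - just the general principle behind such ciphers.

def lfsr_keystream(state: int, taps: tuple, nbits: int, length: int) -> list:
    """Generate `length` keystream bits from an nbits-wide LFSR."""
    out = []
    for _ in range(length):
        out.append(state & 1)                       # emit the low bit
        fb = 0
        for t in taps:                              # XOR the tapped positions
            fb ^= (state >> t) & 1
        state = (state >> 1) | (fb << (nbits - 1))  # shift in the feedback bit
    return out

# A stream cipher encrypts by XORing plaintext bits with the keystream.
key_state = 0b1010110011010111                      # hypothetical 16-bit state
stream = lfsr_keystream(key_state, taps=(0, 2, 3, 5), nbits=16, length=8)
plaintext_bits = [1, 0, 1, 1, 0, 0, 1, 0]
ciphertext_bits = [p ^ k for p, k in zip(plaintext_bits, stream)]
```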

To further align with export control regulations, the key generation algorithms COMP128-1 and COMP128-2 decreased the effective output key length to 54 bits by setting 10 bits of the key to zero. While A5/1 and A5/2 mostly met their design criteria, COMP128-1 was a very weak algorithm and was soon replaced by COMP128-2 and COMP128-3. When packet-switched data was introduced with GPRS, the slightly different algorithms GEA1 and GEA2 were introduced. Similar to A5/1 and A5/2, GEA1 and GEA2 are LFSR-based stream ciphers supporting 64-bit key length, where GEA1 was the export cipher. Support for the export ciphers A5/2 and GEA1 has been forbidden in phones for many years, and COMP128-1 is forbidden in both networks and SIM cards. None of the original 2G algorithms were officially published, as they were intended to be kept secret, which was quite common practice at the time. But all were reverse engineered by researchers in academia nearly a decade after their development.
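The effect of zeroing 10 key bits is easy to quantify: a brute-force attacker has 1,024 times fewer keys to try. A quick sketch (the example key value and the choice of which bits are cleared is hypothetical; only the keyspace arithmetic is the point):

```python
# Zeroing 10 bits of a 64-bit key shrinks the effective keyspace
# from 2**64 to 2**54, i.e. by a factor of 2**10 = 1,024.

full_key = int.from_bytes(bytes(range(8)), "big")   # example 64-bit key
weakened = full_key & ~0x3FF                        # clear the low 10 bits

print(f"weakened key:       {weakened:016x}")
print(f"full keyspace:      2**64 = {2**64:.3e} keys")
print(f"effective keyspace: 2**54 = {2**54:.3e} keys (1,024 times smaller)")
```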

The third generation (3G or UMTS) mobile networks introduced publicly specified encryption and integrity algorithms with a 128-bit security level. In 3G, the algorithms were selected by the ETSI Security Algorithms Group of Experts (SAGE), which has since made recommendations for all the new algorithms for mobile networks. The final decision is always taken by 3GPP SA WG3, the security working group in 3GPP. While many other designs from the same time, such as SSH and TLS, turned out to have significant flaws, the 3G algorithms and their modes of operation are still secure today.

The 3G encryption algorithms UEA1 and UEA2 use the KASUMI block cipher and the SNOW 3G stream cipher, which are slightly modified versions of the MISTY1 block cipher and the SNOW 2.0 stream cipher respectively. The integrity algorithm UIA1 is a CBC-MAC using KASUMI, and UIA2 is a Carter-Wegman MAC based on SNOW 3G. For authentication and key generation, the exact algorithm is not standardized and it is up to the operator to choose the algorithm deployed in their home network and SIM cards. 3GPP defines the Milenage algorithm (based on AES-128) as a well-designed example algorithm and this choice is widely used in practice. All the 3G algorithms have also been specified for use in 2G.
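The CBC-MAC construction behind UIA1 is simple to sketch. KASUMI is not available in mainstream Python libraries, so the snippet below substitutes AES-128 as the block cipher and omits UIA1's specific padding and finalization steps; only the chaining structure is faithful.

```python
# CBC-MAC sketch: encrypt the message in CBC mode with a zero IV and
# take the last ciphertext block as the tag. AES-128 stands in for KASUMI.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def cbc_mac(key: bytes, message: bytes) -> bytes:
    # Zero-pad the message up to a whole number of 16-byte blocks.
    padded = message + b"\x00" * (-len(message) % 16)
    enc = Cipher(algorithms.AES(key), modes.CBC(b"\x00" * 16)).encryptor()
    ct = enc.update(padded) + enc.finalize()
    return ct[-16:]          # last ciphertext block is the authentication tag

key = os.urandom(16)
tag = cbc_mac(key, b"signalling message")
```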

Figure 1: 3GPP/GSMA algorithms for authentication and key generation - Green algorithms are secure while red algorithms only offer 64-bit security or less.

Figure 2: 3GPP/GSMA algorithms for encryption and integrity protection - Green algorithms are secure while red algorithms only offer 64-bit security or less.

The fourth generation (4G or LTE) mobile networks replaced KASUMI with AES-128. The encryption algorithm 128-EEA2 is AES in counter mode (AES-CTR) while the integrity algorithm 128-EIA2 is AES in CMAC mode. 4G also introduced Tuak, a new algorithm family for authentication and key generation based on the Keccak hash function, but using slightly different parameters from the variant NIST later standardized as SHA-3. SIM cards are recommended to support both Milenage and Tuak. 4G also introduced an optional algorithm, ZUC, used to construct the 128-EEA3 and 128-EIA3 algorithms, which are the only algorithms that are optional for implementations to support. It is also worth mentioning that 3GPP specifies at least two mandatory algorithms due to the security practice of having a backup algorithm.
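Unlike KASUMI, the 4G primitives are available in standard libraries. Below is a minimal sketch of 128-EEA2-style encryption (AES-CTR) and 128-EIA2-style integrity (AES-CMAC) using the Python cryptography package; the real 3GPP algorithms also mix COUNT, BEARER and DIRECTION into the counter block and MAC input, which is omitted here.

```python
# 128-EEA2 is AES-CTR; 128-EIA2 is AES-CMAC with the tag truncated to 32 bits.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.primitives.cmac import CMAC

key = os.urandom(16)                  # 128-bit key
counter_block = os.urandom(16)        # per-message counter/nonce block

# Encryption: XOR the plaintext with the AES-CTR keystream.
enc = Cipher(algorithms.AES(key), modes.CTR(counter_block)).encryptor()
ciphertext = enc.update(b"user plane data") + enc.finalize()

# Integrity: compute AES-CMAC and truncate to a 32-bit MAC, as 128-EIA2 does.
c = CMAC(algorithms.AES(key))
c.update(b"control plane message")
tag32 = c.finalize()[:4]
```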

The fifth generation (5G or NR) uses exactly the same algorithms used in 4G. There are no weaknesses in any of the 4G algorithms and they offer good enough performance when implemented in hardware. However, the currently used algorithms are not suitable for future deployments as they are slow in software, do not support 256-bit keys and support only 32-bit MACs. Software performance is essential for software implementations in virtualized deployments. While these algorithms are fast enough for 5G when implemented in hardware, they perform far worse than state-of-the-art algorithms even in hardware and will likely not be suitable for 6G.

3GPP SA3 and ETSI SAGE have therefore started working together on new virtualization-friendly algorithms suitable for later 5G releases and 6G. It is essential that the new algorithms perform well in software on a wide range of architectures (such as x86, ARM and RISC-V) and that they can also be efficiently implemented in hardware. AES-CTR already fulfils these criteria, but it would have to be accompanied by a high-performance integrity mode like GMAC. SNOW 3G is not up to the task, but the new cipher SNOW-V would be a perfect fit, outperforming even AES-GCM on x86 processors.
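SNOW-V is not yet available in mainstream libraries, but AES-GCM, the benchmark it is compared against, is, and it illustrates the combined encryption-plus-integrity (AEAD) shape that AES-CTR paired with GMAC would take:

```python
# AES-GCM = AES-CTR encryption + a GMAC-style integrity tag, in one AEAD call.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)    # 256-bit key, per the trend above
aead = AESGCM(key)
nonce = os.urandom(12)                       # 96-bit nonce; never reuse per key

# The second argument is encrypted; the third is authenticated but not encrypted.
ct = aead.encrypt(nonce, b"user plane data", b"header, integrity-protected only")
pt = aead.decrypt(nonce, ct, b"header, integrity-protected only")
```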

The new algorithms to be introduced to 3GPP will likely support only 256-bit key length and offer at least 64-bit tags. While 128-bit algorithms will be practically secure against quantum computers, cellular networks are increasingly classified as critical infrastructure. Already today, governments and financial institutions often mandate more than 128-bit security level for protection of their communication.

While mobile networks use some algorithms and security protocols specific to 3GPP, most of the security protocols used in 5G such as TLS, DTLS, IKEv2, ESP, SRTP, X.509, and JOSE are standardized or maintained by the Internet Engineering Task Force (IETF). 3GPP has, for many years, had the excellent tradition of updating their security profiles in almost every release following recommendations from academia, IETF and other organizations. A large part of this work has been driven by Ericsson.

The general 3GPP profiles for (D)TLS, IPsec and X.509 specified in TS 33.210 and TS 33.310 apply to many different 3GPP interfaces. 3GPP now has some of the best and most secure profiles for TLS and IPsec. 3GPP was, for example, very early in mandating support for TLS 1.3 and in forbidding TLS 1.1 and all weak cipher suites in TLS 1.2. Best practice today is to encrypt as much information as possible and to do key exchange with Diffie-Hellman to enable Perfect Forward Secrecy (PFS). The profiles are well ahead of most other industries as well as the IETF's own profiles. 5G is increasingly referred to as critical infrastructure and as such the security profiling should be state-of-the-art.
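What such profiling looks like in practice can be sketched with Python's ssl module: refuse old protocol versions and restrict TLS 1.2 to strong ECDHE (PFS) AEAD suites. This is an illustrative client-side configuration, not the 3GPP-specified profile.

```python
# Sketch of a hardened TLS client profile: TLS 1.1 and below refused,
# TLS 1.2 limited to forward-secret AEAD cipher suites.
import ssl

ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2     # forbid TLS 1.1 and older
# Restrict TLS 1.2 to ECDHE (PFS) with AEAD ciphers; TLS 1.3 suites are
# secure by construction and configured separately by the library.
ctx.set_ciphers("ECDHE+AESGCM:ECDHE+CHACHA20")
```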

For Rel-16 and Rel-17, 3GPP initiated work items specific to security updates, but similar work has been done for much longer under the general TEI work item. For Rel-17, 3GPP aims to mandate support for SHA-256 in the few remaining places where MD5 or SHA-1 is still in use, introduce Curve25519 for low-latency key exchange in IKEv2, enable the use of OCSP and OCSP stapling as an alternative to CRLs everywhere, mandate support of DTLS-SRTP and AES-GCM for SRTP, and introduce deterministic ECDSA.

Updating profiles for cryptographic algorithms and security protocols is a process that takes many years because of backward compatibility, as nodes from one release often have to talk to devices from much older releases. Before any weak algorithms or protocol versions are forbidden, the support of strong alternatives needs to have been mandatory for several releases.

Taking into consideration that 3GPP produces approximately one release every 1.5 years, it is essential to mandate the support of new versions of security protocols as soon as possible, as 3GPP did with TLS 1.3. Among the drawbacks of TLS 1.2 are that it requires a large amount of configuration to become secure and that it does not provide identity protection; it should therefore be phased out in the future.

Current best practice is to mandate the support of at least two strong algorithms everywhere, so there is always a strong algorithm supported if one of the algorithms is broken. The National Institute of Standards and Technology (NIST) has long functioned as a global standardization organization for cryptographic algorithms. NIST standardizes algorithms in open competitions, inviting contributions from academia all over the world. Both AES and SHA-3 were designed by researchers from Europe. Recently, the Internet Research Task Force Crypto Forum Research Group (IRTF CFRG) has complemented NIST as a global cryptographic Standards Developing Organization (or SDO) and has standardized algorithms like ChaCha20-Poly1305, Curve25519, EdDSA, LMS, and XMSS. NIST has introduced many of the CFRG algorithms within their own standards.
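Two of the CFRG algorithms named above are directly usable from the Python cryptography package. The sketch below runs an ephemeral X25519 Diffie-Hellman exchange (the basis for PFS), derives a key with HKDF, and encrypts with ChaCha20-Poly1305; the HKDF info label is an arbitrary example.

```python
# X25519 key agreement + ChaCha20-Poly1305 AEAD, both CFRG-standardized.
import os
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

alice, bob = X25519PrivateKey.generate(), X25519PrivateKey.generate()
shared = alice.exchange(bob.public_key())        # both sides derive this value
key = HKDF(hashes.SHA256(), 32, salt=None, info=b"demo").derive(shared)

nonce = os.urandom(12)
ct = ChaCha20Poly1305(key).encrypt(nonce, b"hello", None)
```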

Broken algorithms were once very common, but essentially all algorithms standardized by NIST, IRTF CFRG and ETSI SAGE since 2000 (such as AES, SHA-2, SHA-3, ChaCha20, KASUMI and SNOW 3G) have remained secure, with no practical attacks. Figure 3 gives an overview of broken, weak or legacy algorithms and security protocols. 3GPP has already forbidden most of these and will likely phase out the rest in future releases.

Figure 3: Broken and legacy cryptographic algorithms and security protocols

A big part of future work in upcoming releases will be to introduce quantum-safe algorithms, or Post-Quantum Cryptography (PQC). PQC algorithms are cryptographic algorithms that are secure against attacks from quantum computers - which happens to be most algorithms, with RSA and Elliptic-Curve Cryptography (ECC) being the notable exceptions. This is something 3GPP is well prepared for, having already future-proofed protocols like the 5G Subscription Concealed Identifier (SUCI) by allowing ciphertexts and public keys to be several thousand bytes long. If somebody builds a sufficiently large quantum computer, RSA and ECC will likely be broken in a matter of hours.

Small quantum computers already exist; however, it is still uncertain when (or if) quantum computers capable of breaking these cryptographic algorithms will be built. 3GPP will likely introduce quantum-safe algorithms long before quantum computers even get close to affecting the security of 3GPP systems. Introducing non-standardized cryptographic algorithms likely introduces more risks than it removes, and both 3GPP and IETF have decided to wait for NIST standardization of PQC algorithms, which is already in its final round and will be ready in 2022-2024. After that, IETF will standardize the use of PQC algorithms in (D)TLS, IKEv2, X.509, JOSE and HPKE, and as soon as this is done, 3GPP will introduce the new updated IETF RFCs.

Some of the candidates for post-quantum security level 1 in the final round of NIST PQC standardization are summarized in Figure 4. It seems very likely that one of the lattice-based algorithms will be the main replacement for RSA and ECC, for both Key Encapsulation Mechanisms (KEM) and signatures. A KEM provides a simplified interface for key exchange and public key encryption. Lattice-based algorithms have slightly larger public key, signature and ciphertext sizes than those of RSA, but they are even faster than ECC. As can be seen from Figure 4, PQC is practical for most applications. The transition to PQC can be seen as a bigger step than the transitions from 3DES to AES and from SHA-1 to SHA-256, as it might require security protocol changes to a larger degree. Note that PQC algorithms do not rely on quantum mechanics, and software implementations do not require any new hardware.
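The KEM interface mentioned above is worth seeing in code. Since standardized PQC libraries are not yet ubiquitous, the sketch below implements the keygen/encapsulate/decapsulate shape with X25519 standing in for a lattice KEM (essentially the DHKEM construction from HPKE); a PQC KEM would expose the same interface, just with larger keys and ciphertexts.

```python
# KEM interface sketch: keygen() -> (sk, pk); encapsulate(pk) -> (ct, ss);
# decapsulate(sk, ct) -> ss. Built here from X25519 as a stand-in.
from cryptography.hazmat.primitives.asymmetric.x25519 import (
    X25519PrivateKey, X25519PublicKey)
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

def _kdf(secret: bytes) -> bytes:
    return HKDF(hashes.SHA256(), 32, salt=None, info=b"kem-demo").derive(secret)

def keygen():
    sk = X25519PrivateKey.generate()
    return sk, sk.public_key()

def encapsulate(pk: X25519PublicKey):
    eph = X25519PrivateKey.generate()                 # fresh ephemeral key
    ss = _kdf(eph.exchange(pk))                       # sender's shared secret
    ct = eph.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
    return ct, ss                                     # "ciphertext" = eph. pk

def decapsulate(sk: X25519PrivateKey, ct: bytes) -> bytes:
    return _kdf(sk.exchange(X25519PublicKey.from_public_bytes(ct)))

sk, pk = keygen()
ct, ss_sender = encapsulate(pk)
assert decapsulate(sk, ct) == ss_sender               # both ends share the secret
```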

Figure 4: Some candidates (post-quantum security level 1) in the third and final round of NIST PQC Standardization. The performance measurements are single-core on Skylake 2.5 GHz https://bench.cr.yp.to/ebats.html (lower is better)

128-bit symmetric algorithms will not be practically affected by quantum computers and NIST is currently labeling AES-128 as post-quantum security level 1. Even so, 3GPP is moving towards increased use of 256-bit keys and algorithms such as AES-256.

More information about the algorithms used in mobile networks can be found in the specification series prepared by the 3GPP SA3 working group. For the main profiles used in the security protocols, check 3GPP TS 33.210 and TS 33.310.

To learn and keep up to date on the latest progress in post-quantum cryptography, follow NIST PQC Standardization.

Learn more about the realities of post-quantum cryptography in our previous blog post from 2020.

Discover how 5G fits into mobile communication network security in our guide to 5G network security

Read our summary of the latest standardization work from 3GPP, Release 16 (5G phase 2)

See more here:
The evolution of cryptographic algorithms - Ericsson

Quantum computing just took on another big challenge, one that could be as tough as steel – ZDNet

Nippon Steel has concluded that, despite the current hardware limitations of quantum computers, the technology holds a lot of promise when it comes to optimizing complex problems.

From railways and ships all the way to knives and forks: the number of products that are made of steel is too high to list. To ensure a steady supply of the sought-after material, Japanese steel manufacturer Nippon Steel is now looking at how quantum computing might help.

The company, which produced a hefty 50 million tons of steel in 2019 (that is, 40% of the total production in Japan), has partnered with Cambridge Quantum Computing (CQC) and Honeywell to find out whether quantum computers have the potential to boost efficiencies in the supply chain.

And after over a year of testing and trying new algorithms, the company has concluded that, despite the current hardware limitations of quantum computers, the technology holds a lot of promise when it comes to optimizing complex problems.

"The results Nippon Steel and Cambridge Quantum Computing were able to achieve indicate that quantum computing will be a powerful tool for companies seeking a competitive advantage," said Tony Uttley, the president of Honeywell Quantum Solutions.


The steel manufacturing process is a highly elaborate affair, involving many different steps and requiring various raw materials before the final product can be built.

Plants start by pre-treating and refining iron ore, coal and other minerals to process them into slabs of steel, which are then converted into products like rails, bars, pipes, tubes and wheels.

In the case of Nippon Steel, where millions of tons of material are at stake, finding the best equation to make sure that the right products are in the right place and at the right time is key to delivering orders as efficiently as possible.

Toss in strict deadlines, and it is easy to see why industry leaders are looking for the most advanced tools possible to model and optimize the whole system, and at the same time reduce operating costs.

For this reason, the use of pen and paper has long been replaced by sophisticated software services, and Nippon Steel has been a long-time investor in advanced computing. But even today's most powerful supercomputers can struggle to come up with optimal solutions to such complex problems.

Classical computers can only offer simplifications and approximations. The Japanese company, therefore, decided to try its hand at quantum technologies, and announced a partnership with quantum software firm CQC last year.

"Scheduling at our steel plants is one of the biggest logistical challenges we face, and we are always looking for ways to streamline and improve operations in this area," said Koji Hirano, chief researcher at Nippon Steel.

Quantum computers rely on qubits - tiny particles that can take on a special, dual quantum state that enables them to carry out multiple calculations at once. This means, in principle, that the most complex problems that cannot be solved by classical computers in any realistic timeframe could one day be run on quantum computers in a matter of minutes.

The technology is still in its infancy: quantum computers can currently only support very few qubits and are not capable of carrying out computations that are useful at a business's scale. Scientists, rather, are interested in demonstrating the theoretical value of the technology, to be prepared to tap into the potential of quantum computers once their development matures.

In practice, for Nippon Steel, this meant using CQC's services and expertise to discover which quantum algorithms could most effectively model and optimize the company's supply chain.

To do so, the two companies' research teams focused on formulating a small-scale problem, which, although it does not bring significant value to Nippon Steel, can be resolved using today's nascent quantum hardware.

The researchers developed a quantum algorithm for this "representative" problem and ran it on Honeywell's System Model H1 - the latest iteration of the company's trapped-ion quantum computing hardware, which has 10 available qubits and a record-breaking quantum volume of 512. After only a few steps, say the scientists, the System Model H1 was able to find an optimal solution.
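Neither company has published the exact formulation, but the general approach to such "representative" problems is to encode a small scheduling choice as a QUBO (quadratic unconstrained binary optimization) and minimize it, which is exactly what a quantum optimizer is asked to do at larger scales. A hypothetical toy, small enough to brute-force classically:

```python
# Hypothetical toy QUBO for a scheduling choice: x[j] = 1 if job j goes to
# the early shift, 0 for the late shift. All costs below are made up.
from itertools import product

linear = {0: 1.0, 1: 2.0, 2: 0.5}      # per-job cost of the early shift
penalty = {(0, 1): 3.0, (1, 2): 2.0}   # job pairs that should not share a shift

def qubo_cost(x):
    cost = sum(c * x[j] for j, c in linear.items())
    for (i, j), p in penalty.items():
        # Penalize i and j landing in the SAME shift: x_i*x_j + (1-x_i)*(1-x_j)
        cost += p * (x[i] * x[j] + (1 - x[i]) * (1 - x[j]))
    return cost

# Exhaustive search over all 2**3 assignments stands in for the quantum solver.
best = min(product((0, 1), repeat=3), key=qubo_cost)
print(best, qubo_cost(best))
```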

"The results are encouraging for scaling up this problem to larger instances," said Mehdi Bozzo Rey, the head of business development at CQC. "This experiment showcases the capabilities of the System Model H1 paired with modern quantum algorithms and how promising this emerging technology really is."

What's more: an optimization algorithm such as the one developed by CQC and Nippon Steel can be applied to many other scenarios in manufacturing, transport and distribution.

Earlier this year, for example, IBM and energy giant ExxonMobil revealed that they had been working together to build quantum algorithms that could one day optimize the routing of tens of thousands of merchant ships crossing the oceans to deliver everyday goods - a $14 trillion industry that could hugely benefit from operational efficiencies.

The results from Nippon Steel are the first to emerge following the announcement of a partnership between Honeywell and CQC earlier this month. CQC's quantum software capabilities are planned to merge with Honeywell's quantum hardware services in a deal that is expected to make waves in the industry.

By joining forces, the two companies are effectively set to become leaders in the quantum ecosystem. The early results from the trials with Nippon Steel, therefore, are likely to be only the start of many new projects to come, as the two firms apply their complementary expertise to global issues affecting various different industries.

Originally posted here:
Quantum computing just took on another big challenge, one that could be as tough as steel - ZDNet

A new piece of the quantum computing puzzle – Washington University in St. Louis Newsroom

Research from the McKelvey School of Engineering at Washington University in St. Louis has found a missing piece in the puzzle of optical quantum computing.

Jung-Tsung Shen, associate professor in the Department of Electrical & Systems Engineering, has developed a deterministic, high-fidelity two-bit quantum logic gate that takes advantage of a new form of light. This new logic gate is orders of magnitude more efficient than the current technology.

"In the ideal case, the fidelity can be as high as 97%," Shen said.

His research was published in May 2021 in the journal Physical Review A.

The potential of quantum computers is bound to the unusual properties of superposition - the ability of a quantum system to contain many distinct properties, or states, at the same time - and entanglement - two particles acting as if they are correlated in a non-classical manner, despite being physically removed from each other.

Where voltage determines the value of a bit (a 1 or a 0) in a classical computer, researchers often use individual electrons as qubits, the quantum equivalent. Electrons have several traits that suit them well to the task: they are easily manipulated by an electric or magnetic field and they interact with each other. Interaction is a benefit when you need two bits to be entangled, letting the wilderness of quantum mechanics manifest.

But their propensity to interact is also a problem. Everything from stray magnetic fields to power lines can influence electrons, making them hard to truly control.

For the past two decades, however, some scientists have been trying to use photons as qubits instead of electrons. "If computers are going to have a true impact, we need to look into creating the platform using light," Shen said.

Photons have no charge, which leads to the opposite problems: they do not interact with the environment like electrons, but they also do not interact with each other. It has also been challenging to engineer and create ad hoc (effective) inter-photon interactions. Or so traditional thinking went.

Less than a decade ago, scientists working on this problem discovered that, even if they weren't entangled as they entered a logic gate, the act of measuring the two photons when they exited led them to behave as if they had been. The unique features of measurement are another wild manifestation of quantum mechanics.

"Quantum mechanics is not difficult, but it's full of surprises," Shen said.

The measurement discovery was groundbreaking, but not quite game-changing. That's because for every 1,000,000 photons, only one pair became entangled. Researchers have since been more successful, but, Shen said, "It's still not good enough for a computer, which has to carry out millions to billions of operations per second."

Shen was able to build a two-bit quantum logic gate with such efficiency because of the discovery of a new class of quantum photonic states - photonic dimers, photons entangled in both space and frequency. His prediction of their existence was experimentally validated in 2013, and he has since been finding applications for this new form of light.

When a single photon enters a logic gate, nothing notable happens - it goes in and comes out. But when there are two photons, "That's when we predicted the two can make a new state, photonic dimers. It turns out this new state is crucial."

Mathematically, there are many ways to design a logic gate for two-bit operations. These different designs are called equivalent. The specific logic gate that Shen and his research group designed is the controlled-phase gate (or controlled-Z gate). The principal function of the controlled-phase gate is that the two photons that come out are in the negative state of the two photons that went in.

"In classical circuits, there is no minus sign," Shen said. "But in quantum computing, it turns out the minus sign exists and is crucial."
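That minus sign is easy to see numerically. The controlled-phase (controlled-Z) gate is a diagonal matrix that leaves the |00>, |01> and |10> amplitudes alone and negates only |11>; a quick NumPy check:

```python
# Controlled-Z gate acting on a two-qubit state vector
# ordered as (|00>, |01>, |10>, |11>).
import numpy as np

CZ = np.diag([1, 1, 1, -1])

# An equal superposition of all four two-qubit basis states...
state = np.full(4, 0.5)
print(CZ @ state)    # -> [ 0.5  0.5  0.5 -0.5]: only the |11> amplitude flips sign
```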

When two independent photons (representing two optical qubits) enter the logic gate, "The design of the logic gate is such that the two photons can form a photonic dimer," Shen said. "It turns out the new quantum photonic state is crucial as it enables the output state to have the correct sign that is essential to the optical logic operations."

Shen has been working with the University of Michigan to test his design, which is a solid-state logic gate - one that can operate under moderate conditions. So far, he says, results seem positive.

Shen says this result, while baffling to most, is clear as day to those in the know.

"It's like a puzzle," he said. "It may be complicated to do, but once it's done, just by glancing at it, you will know it's correct."

The McKelvey School of Engineering at Washington University in St. Louis promotes independent inquiry and education with an emphasis on scientific excellence, innovation and collaboration without boundaries. McKelvey Engineering has top-ranked research and graduate programs across departments, particularly in biomedical engineering, environmental engineering and computing, and has one of the most selective undergraduate programs in the country. With 140 full-time faculty, 1,387 undergraduate students, 1,448 graduate students and 21,000 living alumni, we are working to solve some of society's greatest challenges; to prepare students to become leaders and innovate throughout their careers; and to be a catalyst of economic development for the St. Louis region and beyond.

This research was supported by the National Science Foundation, ECCS grants nos. 1608049 and 1838996. It was also supported by the 2018 NSF Quantum Leap (RAISE) Award.

See original here:
A new piece of the quantum computing puzzle - Washington University in St. Louis Newsroom