Archive for the ‘Quantum Computer’ Category

Encryption: How It Works, Types, and the Quantum Future | eSP – eSecurity Planet

Encryption and the development of cryptography have been a cornerstone of IT security for decades and remain critical for data protection against evolving threats.

While cryptology is thousands of years old, modern cryptography took off in the 1970s with the help of the Diffie-Hellman-Merkle and RSA encryption algorithms. As networks evolved and organizations adopted internet communications for critical business processes, these cryptographic systems became essential for protecting data.

Through public and commercial development of advanced encryption methods, organizations from sensitive government agencies to enterprise companies can ensure protected communications between personnel, devices, and global offices. Financial institutions in the 1990s and 2000s were some of the first to incorporate encryption to protect online transactions, particularly as backup tapes were lost in transit.

The race continues for cryptographers to keep encryption systems ahead of cryptanalysts and hackers. Quantum computing attacks already present a real threat to existing standards, making the continued development of encryption pivotal for years to come.

This article looks at encryption, how it fits into cryptology, how cryptographic algorithms work, types, use cases, and more.

Encryption is the act of translating data into secret code (ciphertext) and back again (plaintext) for secure access between multiple parties. With shared protocols and encryption algorithms, users can encode files or messages only accessible to other select clients.

To no one's surprise, the study of cryptography and advancements in encryption are essential to developing cybersecurity. Individuals, small businesses, and enterprise organizations all rely on encryption to securely store and transfer sensitive data across wide-area networks (WANs) like the internet.

Application developers managing sensitive user data must especially beware of increasing regulatory action surrounding data privacy.

Cryptology is the overarching field of study related to writing and solving codes, whereas encryption and decryption are the central processes driving the computer science discipline.

As seen below, cryptography is the methodology and applications for managing encryption schemes, and cryptanalysis is the methodology of testing and decrypting these messages.

Cryptographers versed in the latest encryption methods help cybersecurity companies, software developers, and national security agencies secure assets. Cryptanalysts are the individuals and groups responsible for breaking encryption algorithms for good, bad, and ugly reasons.

Penetration testers and red teamers are critical for remaining vigilant in an ever-changing threat environment and for catching the vulnerabilities otherwise missed. Alternatively, advanced persistent threats (APTs) are always around the corner trying to do the same.

While there are several encryption schemes, they all share the ability to encrypt and decrypt data through a cryptographic key. This key is a random string generated specifically for the transaction, and as a rule, the longer the key and the more complex the algorithm, the stronger the encryption.

Brute force attacks are among the most common cryptanalytic methods, and the time it takes to break an encrypted message is a recognized indicator of the encryption strength.

For users familiar with password management and the value of complex passwords, this makes sense: the longer and more complex the key, the longer it will take an attacker to break the encrypted message.
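To put key length in perspective, here is a rough back-of-the-envelope estimate, assuming a hypothetical attacker testing one trillion keys per second (an illustrative figure, not a benchmark):

```python
# Rough brute-force cost estimate. The guess rate below is an assumption for
# illustration; real attack hardware varies enormously.
GUESSES_PER_SECOND = 1e12                 # hypothetical: one trillion key guesses/second
SECONDS_PER_YEAR = 3600 * 24 * 365.25

for bits in (56, 128, 256):               # DES-era key vs. modern AES key sizes
    keyspace = 2 ** bits
    # On average an attacker searches half the keyspace before hitting the key.
    years = (keyspace / 2) / GUESSES_PER_SECOND / SECONDS_PER_YEAR
    print(f"{bits}-bit key: ~{years:.3g} years on average")
```

Every additional bit doubles the search space, which is why retiring 56-bit DES in favor of 128-bit and 256-bit keys mattered so much.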

Without encryption, data from users and organizations alike would be widely available for all to see on public networks. Individuals and application developers hold responsibility for using and implementing services secured by a good encryption algorithm.

Not every application or network requires military-grade encryption; however, enterprise organizations can't go wrong with the services offering the most strength.

A visible example of the role encryption plays with everyday web traffic is the transition from HTTP to HTTPS protocols witnessed in the last decade. Short for the Hypertext Transfer Protocol, HTTP was central to the World Wide Web development in the 1990s and remains a popular application layer protocol connecting users to internet content through a web browser.

In 1994, Secure Sockets Layer (SSL) emerged to give clients an encrypted method to surf the web. By 1999, its successor, the Transport Layer Security (TLS) protocol, offered a more robust cryptographic protocol across technical components like cipher suites, the record protocol, message authentication, and the handshake process. HTTP over SSL or HTTP over TLS, dubbed HTTPS, wasn't immediately adopted by the masses.

Thanks to an industry campaign led by the Electronic Frontier Foundation (EFF) for users, website owners, and hosting providers to prioritize secure web traffic, HTTPS has overcome its less secure older sibling. In 2016, only 40% of websites protected their web pages and visiting users with HTTPS. Five years later, that number is more than 90% of websites, protecting users en masse from web attacks.

Before computer science, two individuals could use an identical key to unlock a shared mailbox or gate. Today, symmetric encryption via block ciphers or stream ciphers works much the same way, offering two or more users the ability to encrypt and decrypt messages with a single, shared key between stakeholders.

Users can establish a symmetric key to share private messages through a secure channel like a password manager. Unfortunately, while symmetric encryption is a faster method, it also is less secure.

Symmetric models rely on the integrity of the private key, and sharing it in plaintext over text or email leaves users vulnerable. Phishing and social engineering are common ways threat actors can obtain a symmetric key, but cryptanalysis and brute force attempts can also break symmetric key ciphers.
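As a minimal sketch of the single-shared-key model, here is an example using the Python cryptography package's Fernet recipe, an authenticated symmetric scheme (the package choice is an assumption for illustration, not something the article prescribes). Anyone holding the key can both encrypt and decrypt:

```python
# Minimal symmetric-encryption sketch using the "cryptography" package's Fernet
# recipe (authenticated symmetric encryption). Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # the single shared secret both parties must hold
cipher = Fernet(key)

ciphertext = cipher.encrypt(b"wire transfer approved")
plaintext = cipher.decrypt(ciphertext)
assert plaintext == b"wire transfer approved"
```

The security of the whole exchange collapses to the secrecy of that one key, which is exactly the weakness described above.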

In the 1970s, the demand for more secure cryptographic systems was met by computer scientists at Stanford and MIT, who developed the first examples of asymmetric encryption.

Unlike symmetric cryptography, asymmetric encryption is a complex mathematical process in which two users exchange public and private components to create a shared, unique key. Though more complicated and expensive to implement, asymmetric encryption uses thousands of bits and a robust key generation process to ensure secure communications over distributed networks.

Software developers and organizations increasingly use symmetric and asymmetric encryption methods to give users speed and security in communication.

Also known as hybrid encryption, the bundle of the two methods usually starts with a handshake between users through asymmetric cryptography to establish security. Within the asymmetric connection, parties then use symmetric algorithms for the faster processing of messages.
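A hedged sketch of how such a bundle can be assembled with the Python cryptography package: the recipient's RSA public key wraps a one-time AES session key, and the bulk message travels under the faster symmetric cipher. Real protocols such as TLS add certificates, key derivation, and replay protection on top of this skeleton.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Recipient generates a long-lived asymmetric key pair.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Sender: create a one-time symmetric session key and wrap it with RSA-OAEP.
session_key = AESGCM.generate_key(bit_length=256)
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = public_key.encrypt(session_key, oaep)

# Sender: encrypt the bulk message quickly with symmetric AES-GCM.
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"quarterly report attached", None)

# Recipient: unwrap the session key with the RSA private key, then decrypt.
recovered_key = private_key.decrypt(wrapped_key, oaep)
assert AESGCM(recovered_key).decrypt(nonce, ciphertext, None) == b"quarterly report attached"
```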

Cryptography challenges have been met by leading computer scientists, universities, and national security and intelligence agencies. The below section looks at the most substantial standards in the evolution of encryption.

The need for a government-wide standard to encrypt sensitive information became evident in 1973, when the U.S. National Bureau of Standards, now the National Institute of Standards and Technology (NIST), made a public request for potential ciphers. The winning algorithm, dubbed the Data Encryption Standard (DES), was developed and proposed by IBM and lead cryptographer Horst Feistel.

By the 1990s, DES had received wide criticism for its vulnerability to brute force attacks and its short key size. Triple DES, in which the DES cipher runs over each data block three times, proved more secure but still insufficient for the emerging online ecosystem and the volume of data to come.

Shortly after the release of DES, computer scientists Whitfield Diffie, Martin Hellman, and Ralph Merkle published their research on public-private key cryptography in 1976. As it came to be known, the Diffie-Hellman-Merkle (DHM) key exchange set a precedent for asymmetric encryption before the global networking boom.

Unlike earlier symmetric methods, which relied on comparatively short keys, the DHM key exchange supports key lengths of 2,048 to 4,096 bits in modern use.
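The arithmetic behind the exchange can be demonstrated with deliberately tiny, insecure numbers; production systems use standardized prime moduli of 2,048 bits or more.

```python
# Toy Diffie-Hellman-Merkle exchange with tiny, insecure numbers for illustration.
p, g = 23, 5                       # public prime modulus and generator

a = 6                              # Alice's private value (kept secret)
b = 15                             # Bob's private value (kept secret)

A = pow(g, a, p)                   # Alice sends g^a mod p over the open channel
B = pow(g, b, p)                   # Bob sends g^b mod p over the open channel

alice_shared = pow(B, a, p)        # (g^b)^a mod p
bob_shared = pow(A, b, p)          # (g^a)^b mod p
assert alice_shared == bob_shared  # both sides arrive at the same shared secret
print("shared secret:", alice_shared)
```

An eavesdropper sees p, g, A, and B, but recovering the shared secret requires solving the discrete logarithm problem, which is infeasible at real key sizes.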

A year after DHM's findings were published, cryptographers Ron Rivest, Adi Shamir, and Leonard Adleman developed the RSA public-key cryptosystem.

The three innovators and MIT patented the RSA algorithm, a proprietary system available through RSA Security until its public release in 2000. Standing the test of time, the RSA algorithm remains the most popular public key cryptographic system today and introduced the concept of digital signatures for authentication.
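A textbook sketch of the RSA math with tiny primes shows key generation, encryption, decryption, and the signature idea in a few lines; real deployments use moduli of 2,048 bits or more plus randomized padding such as OAEP or PSS.

```python
# Textbook RSA with tiny primes, for illustration only.
from math import gcd

p, q = 61, 53
n = p * q                          # public modulus: 3233
phi = (p - 1) * (q - 1)            # 3120
e = 17                             # public exponent, coprime to phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)                # private exponent: modular inverse of e (Python 3.8+)

message = 65                       # a message encoded as an integer < n
ciphertext = pow(message, e, n)    # encrypt with the public key (n, e)
recovered = pow(ciphertext, d, n)  # decrypt with the private key (n, d)
assert recovered == message

# The same key pair supports digital signatures: sign with d, verify with e.
signature = pow(message, d, n)
assert pow(signature, e, n) == message
```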

In 1997, NIST renewed its call to the public cryptography community for a successor to DES. Two Belgian cryptographers, Joan Daemen and Vincent Rijmen, submitted the eventual pick, known as Rijndael. By 2001, NIST had dubbed it the Advanced Encryption Standard (AES) and officially replaced DES.

AES offered larger and different key sizes with a family of ciphers to choose from and remains one of the most popular standards over 20 years later.

While both DES and AES use symmetric block ciphers, AES uses a substitution-permutation network wherein plaintext goes through multiple rounds of substitution (S-box) and permutation (P-box) before finalizing the ciphertext block. Similarly, a client or application can decrypt the AES message by reversing these S-box and P-box transformations.
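To make the substitution-permutation idea concrete, here is a toy single round on a 16-bit block. It is illustrative only: real AES operates on 128-bit blocks over 10 to 14 key-mixed rounds, and its diffusion layer uses ShiftRows and MixColumns rather than a plain bit permutation.

```python
# Toy substitution-permutation round on a 16-bit block (not real AES).
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]            # invertible 4-bit S-box
INV_SBOX = [SBOX.index(i) for i in range(16)]
PBOX = [0, 4, 8, 12, 1, 5, 9, 13, 2, 6, 10, 14, 3, 7, 11, 15]  # bit permutation
INV_PBOX = [PBOX.index(i) for i in range(16)]

def substitute(block, sbox):
    # Apply the S-box to each 4-bit nibble of the 16-bit block.
    return sum(sbox[(block >> (4 * i)) & 0xF] << (4 * i) for i in range(4))

def permute(block, pbox):
    # Move bit i of the input to position pbox[i] of the output.
    out = 0
    for i in range(16):
        if block & (1 << i):
            out |= 1 << pbox[i]
    return out

def round_fn(block):
    return permute(substitute(block, SBOX), PBOX)

def inv_round_fn(block):
    return substitute(permute(block, INV_PBOX), INV_SBOX)

pt = 0b1010_0011_1100_0101
ct = round_fn(pt)
assert inv_round_fn(ct) == pt      # decryption reverses the S-box and P-box steps
```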

Neal Koblitz of the University of Washington and Victor Miller of IBM independently published research on elliptic curve cryptography (ECC) in 1985, but it didn't see widespread implementation until the mid-2000s.

Like RSA, ECC is an encryption algorithm for public key cryptography, but instead of prime numbers for generating key pairs, ECC uses elliptic curves. ECC is faster than RSA with a smaller key size while maintaining its security with the mathematics behind elliptic curves over finite fields.

ECC has proven a popular choice for web applications, blockchains, and mobile devices as a fast, lightweight yet secure alternative to RSA. ECC isn't immune to compromise, however, facing threats such as twist attacks and side-channel attacks.
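A minimal elliptic-curve key-agreement sketch with the Python cryptography package; note that the 256-bit curve keys stand in for the much larger RSA keys used earlier.

```python
# Elliptic-curve Diffie-Hellman (ECDH) key agreement sketch using the
# "cryptography" package. A 256-bit curve key offers strong security with
# far smaller keys than RSA.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

alice_private = ec.generate_private_key(ec.SECP256R1())
bob_private = ec.generate_private_key(ec.SECP256R1())

# Each side combines its own private key with the other's public key.
alice_secret = alice_private.exchange(ec.ECDH(), bob_private.public_key())
bob_secret = bob_private.exchange(ec.ECDH(), alice_private.public_key())
assert alice_secret == bob_secret

# Derive a usable symmetric session key from the raw shared secret.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"handshake").derive(alice_secret)
print(len(session_key), "byte session key derived")
```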

In 1978, Rivest and Adleman, together with Michael Dertouzos, published additional research on a cryptographic method dubbed homomorphic encryption. However, it wasn't until 2009 that Craig Gentry, then a graduate student at Stanford, published research on fully homomorphic encryption (FHE) and set off a period of exploration.

Unlike conventional cryptography, homomorphic encryption allows a limited set of operations to be performed on ciphertext without decrypting the message. Homomorphic models include partially homomorphic encryption (PHE) for a single operation type, somewhat homomorphic encryption (SHE) for a limited mix of operations, and FHE for the broadest operational control over encrypted data.
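Textbook (unpadded) RSA happens to be partially homomorphic with respect to multiplication, which gives a compact, if insecure, illustration of the idea; the toy key reuses the small RSA example above. FHE schemes extend this so arbitrary computations can run on ciphertext.

```python
# Partial homomorphic encryption illustrated with textbook RSA (multiplicative
# only, and insecure at this size): multiplying ciphertexts multiplies the
# underlying plaintexts without ever decrypting the inputs.
n, e, d = 3233, 17, 2753           # toy RSA key from the sketch above

m1, m2 = 7, 9
c1 = pow(m1, e, n)
c2 = pow(m2, e, n)

c_product = (c1 * c2) % n                       # computed entirely on ciphertexts
assert pow(c_product, d, n) == (m1 * m2) % n    # decrypts to 63
```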

More than a decade later, companies like Google, IBM, and Microsoft continue to explore FHE capabilities where an organization can process specific data within an encrypted message while maintaining the integrity of the data. FHE remains a maturing cryptographic system with little evidence to date of widespread adoption.

Because they exploit quantum mechanics rather than classical computation, quantum computers running Shor's algorithm for finding prime factors could break asymmetric standards like DHM, RSA, and ECC once sufficiently large, fault-tolerant machines exist.
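Shor's algorithm derives its power from finding the multiplicative order of a number modulo N far faster than any known classical method; the rest of the factoring recipe is classical and simple enough to sketch. In the toy code below, the brute-force order-finding loop is exactly the step a quantum computer would replace.

```python
# Classical skeleton of Shor's factoring reduction. The order-finding loop is
# the exponentially expensive step that a quantum computer performs efficiently.
from math import gcd

def find_order(a, n):
    # Smallest r > 0 with a^r = 1 (mod n); brute force, feasible only for tiny n.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    assert gcd(a, n) == 1
    r = find_order(a, n)
    if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
        return None                       # unlucky choice of a; pick another
    return gcd(pow(a, r // 2) - 1, n), gcd(pow(a, r // 2) + 1, n)

print(shor_classical(15, 7))              # -> (3, 5)
```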

Post-quantum cryptography (PQC) describes the budding market working to address quantum attacks and secure the next generation of IT environments and data. Agencies like NIST and the NSA continue to release security guidelines against quantum threats, but there's still much to learn about quantum information science (QIS), and no official U.S. standard has been finalized.

Earlier this month, the White House released a national security memo outlining U.S. administrative objectives for adopting quantum-resistant cryptography. While initial standards are expected by 2024, a full mitigation architecture for federal agencies isn't expected until 2035.

Cryptographic systems underpin many of the most common applications in today's IT environments.

Cryptology long predates today's encryption algorithms protecting the data stored in our pockets and moving across the web. From Julius Caesar to the Enigma code, cryptographic methods have continued to grow more complex, to the benefit and detriment of various actors.

As cryptanalysts and threat actors poke holes in the latest implementations, it's natural for the industry and users to upgrade to the most robust available algorithms. Outdated and inadequate cryptographic standards leave organizations and users vulnerable, giving those persistent or capable enough the ability to extract, sell, or ransom sensitive data.

The emergence of post-quantum cryptography is a reality stakeholders must grapple with sooner rather than later.

Read more:
Encryption: How It Works, Types, and the Quantum Future | eSP - eSecurity Planet

COMPUTEX 2022 Returns to In-Person With Virtual and Physical Exhibition – HPCwire

TAIPEI, May 24, 2022. Today marked the opening of COMPUTEX 2022, held until May 27 at Taipei Nangang Exhibition Center, Hall 1. Among the distinguished guests in attendance at the opening ceremony to witness the rapid development of new digital technology were President Tsai Ing-wen, Minister of Economic Affairs Wang Mei-Hua, Chairman of the Taiwan External Trade Development Council (TAITRA) James Huang, and Chairman of the Taipei Computer Association Paul Peng.

President Tsai Ing-wen stated, "COMPUTEX is an important platform for the global technology industry, which not only enables Taiwanese companies to strengthen their international collaboration and connect to the global market, but also shows the capabilities of Taiwan's ICT industry to the world. In the future, the development of advanced technologies such as AI, quantum computers, and cloud computing will be highly dependent on chips. Therefore, Taiwan will leverage its strengths in high-end hardware manufacturing and empowering ICT innovations in various industries to make the overall economy more competitive. Also, we will actively work together with enterprises to accelerate the digital transformation process and to build the next golden decade of Taiwan's technology industry."

"Over the past twenty years, technology, our shared global language, has empowered the world and resulted in important milestones. Even when facing urgent challenges such as the pandemic and supply chain disruptions, technology has allowed infinite possibilities," said TAITRA Chairman James Huang. "COMPUTEX's mission has always been to introduce technologies to the world and help make a difference, and this year's event offers an upgraded, hybrid exhibition experience. We look forward to stimulating technological innovation and heading into the future with global technology companies."

Leading global ICT companies are showcasing their innovative technologies and solutions at COMPUTEX. GIGABYTE showcased high-performance computing applications, including AI, 5G, edge computing, intelligent traffic management, security, and gaming and entertainment. Delta Electronics chose to focus on sustainability and presented energy and thermal management solutions for applications such as industrial automation, data center infrastructure, and EV charging. KIOXIA displayed its XG8 series of client SSDs for high-end notebooks, desktops, and workstations. Furthermore, the Garage+ Pavilion selected 48 startups to showcase innovative capabilities in numerous fields, including AI, IoT, health care, and green technology.

COMPUTEX 2022 Provides an Overview of the Global Technology Ecosystems

COMPUTEX 2022 features six main themes: Accelerating Intelligence, Connected X-Experience, Digital Resilience, Innovative Computing, Innovations & Startups, and Sustainability. In addition, a virtual exhibition, COMPUTEX DigitalGO, is being held from today until June 6. By making use of diverse channels, COMPUTEX 2022 has created an interactive platform for global engagement and provided a comprehensive overview of future developments in the global technology ecosystem.

In addition to the comprehensive exhibition, COMPUTEX also offers keynote speeches and forums. In this year's CEO Keynotes, Advanced Micro Devices (AMD), NXP Semiconductors (NXP), Micron Technology, and Supermicro will share their corporate visions from a technology perspective. Microsoft and NVIDIA will also give keynote speeches, streaming live on COMPUTEX's YouTube channel.

The COMPUTEX Forum will be held on May 26 at Taipei Nangang Exhibition Center, Hall 1, Section J. In the morning, in the first session, titled Technology Empowerment, Texas Instruments, Ericsson, NXP, NVIDIA, and Micron Technology will discuss how global technology giants find partners, achieve new advancements, and embrace change.

In the afternoon, in the second session, Delta Electronics will speak on Unceasing Innovation for a Net Zero Future and demystify how businesses are leveraging digital technology to achieve sustainability and reach the 2050 net-zero carbon emissions target. Finally, in the third and final session, themed Application Advancements, HTC, IBM, Dassault Systèmes, and Nokia Taiwan will discuss the metaverse and how businesses can actively deploy smart living and successfully create new work modes.

Furthermore, Live Studio, a new addition to this year's event, will serve as the official news channel for COMPUTEX 2022 and provide participants with the most up-to-date and complete event coverage throughout the show. The Guided Tours are another highlight of the event. Industry KOLs will personally lead the tours, take fans around the booths, and put a brand new spin on technology discovery. In addition, media outlets, including Embedded Computing Design from the US, Dempa Publications from Japan, and IT Chosun from South Korea, will cover COMPUTEX, showing Taiwan's scientific and technological achievements and potential to the world.

This year, COMPUTEX 2022 is being held at Taipei Nangang Exhibition Center, Hall 1. In addition to technology trend sharing, industry application demonstrations, and the fun and interactive Live Studio and Guided Tours, there are photo booths for each of the six themes. Participants who take photos in each booth and upload them will be entered to win the event organizer's limited-edition COMPUTEX 2022 NFT. With so many exciting activities to enjoy, COMPUTEX 2022 is an event not to be missed.

To learn more about COMPUTEX, please visit: https://www.computextaipei.com.tw.

About COMPUTEX

COMPUTEX was founded in 1981. It has grown with the global ICT industry and become stronger over the last four decades. Bearing witness to historical moments in the development of and changes in the industry, COMPUTEX attracts more than 40,000 buyers to visit Taiwan every year. It is also the preferred platform chosen by top international companies for launching epoch-making products.

Taiwan has a comprehensive global ICT industry chain. Gaining a foothold in Taiwan, COMPUTEX is jointly held by the Taiwan External Trade Development Council and the Taipei Computer Association, aiming to build a global tech ecosystem. COMPUTEX uses cross-domain integration and innovation services as the most powerful driving forces for achieving the goal of becoming a new platform for global technological resources.

About TAITRA

The Taiwan External Trade Development Council (TAITRA) is Taiwan's foremost trade promotion organization. TAITRA is a public-benefit corporation founded by the Ministry of Economic Affairs by uniting industry and commerce groups from the private sector with the purpose of helping them expand their global reach. Currently, TAITRA has a team of more than 1,300 trade professionals, both domestically and abroad. Headquartered in Taipei, TAITRA operates 5 local offices in Taoyuan, Hsinchu, Taichung, Tainan, and Kaohsiung, as well as 63 branches worldwide. It has also signed cooperation agreements with 319 sister organizations that promote international trade. By forming a comprehensive trade services network that provides zero-time-difference and borderless real-time services, TAITRA continues to work with enterprises to jointly pursue the steady development of Taiwan's economy. It is the best partner for your success in business expansion.

Source: COMPUTEX

Read the original:
COMPUTEX 2022 Returns to In-Person With Virtual and Physical Exhibition - HPCwire

IBM Collaboration to Advance AI Research with New Center of Excellence in UAE – HPCwire

ABU DHABI, United Arab Emirates, May 25, 2022. Mohamed bin Zayed University of Artificial Intelligence (MBZUAI), the world's first graduate research university dedicated to Artificial Intelligence (AI), has announced plans for a strategic collaboration with IBM. Senior leaders from both organizations signed a Memorandum of Understanding aimed at advancing fundamental AI research, as well as accelerating the types of scientific breakthroughs that could unlock the potential of AI to help solve some of humanity's greatest challenges.

Professor Eric Xing, President of MBZUAI, delivered short remarks, as did Jonathan Adashek, IBM's Senior Vice President and Chief Communications Officer, and Saad Toma, General Manager, IBM Middle East and Africa. The agreement was then signed by Sultan Al Hajji, Vice President for Public Affairs and Alumni Relations at MBZUAI, and Wael Abdoush, General Manager, IBM Gulf and Levant.

"We're excited to be among the first research universities in the MENA region to host a Center of Excellence for AI research and development with technology and expertise from a world-leading technological giant like IBM. This center will provide a highly valuable resource and collaborative environment for our faculty and students to broaden their work in AI. IBM has a long history of technological innovation, and we look forward to joining their latest efforts in our region and together advancing AI technology and commercialization for mutual good," MBZUAI President Professor Eric Xing said.

Saad Toma, General Manager, IBM Middle East and Africa, said: "This collaboration will help drive innovations in AI, which is critical for the future of business and society. We're bringing together some of the brightest minds across both industry and academia, while reinforcing IBM's commitment to promoting knowledge and skills in critical areas for the UAE's development, where the use of technologies like AI is fundamental."

Central to the collaboration is the establishment of a new AI Center of Excellence to be based at the university's Masdar City campus. The Center will leverage the talents of IBM researchers, in collaboration with MBZUAI faculty and students, and will focus on the advancement of both fundamental and applied research objectives.

The initiative seeks to develop, validate, and incubate technologies that harness the capabilities of AI to address civic, social, and business challenges. Further, the collaboration aims to provide real-life applications, particularly in the fields of natural language processing, as well as AI applications that seek to further climate and sustainability goals, and accelerate discoveries in healthcare.

IBM will provide targeted training and technologies as part of the initiative, which supports the university's vision to be a global leader in advancing AI and its application for the good of society and business. For example, through the IBM Academic Initiative, IBM will provide MBZUAI students and faculty with access to IBM tools, software, courseware, and cloud accounts for teaching, learning, and non-commercial research. In addition, through the IBM Skills Academy program, MBZUAI will have access to curated AI curricula, lectures, labs, industry use cases, design-thinking sessions, and an AI Practitioner certification.

The planned relationship is subject to the parties reaching definitive agreements.

About IBM

IBM is a leading global hybrid cloud, AI, and business services provider. We help clients in more than 175 countries capitalize on insights from their data, streamline business processes, reduce costs, and gain the competitive edge in their industries. Nearly 3,000 government and corporate entities in critical infrastructure areas such as financial services, telecommunications, and healthcare rely on IBM's hybrid cloud platform and Red Hat OpenShift to effect their digital transformations quickly, efficiently, and securely. IBM's breakthrough innovations in AI, quantum computing, industry-specific cloud solutions, and business services deliver open and flexible options to our clients. All of this is backed by IBM's legendary commitment to trust, transparency, responsibility, inclusivity, and service. Visit www.ibm.com for more information.

About Mohamed bin Zayed University of Artificial Intelligence (MBZUAI)

MBZUAI is a graduate research university focused on artificial intelligence, computer science, and digital technologies across industrial sectors. The university aims to empower students, businesses, and governments to advance artificial intelligence as a global force for positive progress. MBZUAI offers various graduate programs designed to pursue advanced, specialized knowledge and skills in artificial intelligence, including computer vision, machine learning, and natural language processing. For more information, please visit www.mbzuai.ac.ae.

Source: IBM

Read more:
IBM Collaboration to Advance AI Research with New Center of Excellence in UAE - HPCwire

A Quantum Leap in the Making Meet Tomorrow’s Super Super Computers – TechNative

Modern computers are incredibly versatile, but even the most potent ones struggle with certain types of calculations and modelling

Now, imagine an entirely different kind of computer, with head-spinning power, using mind-bending quantum mechanics to bring barely believable capabilities to life. A super super computer that can tackle, in a split second, calculations that the most powerful conventional machines would need decades to process. This contraption, which resembles a baroque chandelier that could have hung at Versailles, is a quantum computer.

They probably won't replace today's computers (don't expect your next laptop to be a quantum device), but they will be able to tackle certain narrowly defined, highly complex tasks that force traditional computers to throw in the towel. If there are near-endless possible answers to a clearly defined problem, a quantum computer will find the solution much quicker than any conventional computer.

Quantum computers are powered by qubits (i.e., quantum bits), which, due to the strange properties of quantum mechanics, can exist in something called superposition, which in simplified terms means they exist in both 0 and 1 states simultaneously. Imagine flipping a coin. It'll eventually land on either heads or tails. But if you spin it, you could say that before it settles it is both heads and tails at the same time, or rather, there is a possibility that it can be either of the two. It is in superposition. In order to operate at scale, qubits need to be entangled, that is, wired together in superposition. Quantum entanglement, explains IBM, allows qubits, which behave randomly, to be perfectly correlated with each other.
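A purely classical sketch of these two ideas, simulating the state vector with NumPy rather than real hardware: a Hadamard gate puts one qubit into an equal superposition, and a CNOT gate then entangles it with a second qubit so the pair only ever measures as 00 or 11.

```python
import numpy as np

# Single-qubit basis state and gates.
zero = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Superposition: H|0> is an equal mix of |0> and |1>.
plus = H @ zero
print("P(0), P(1) after Hadamard:", np.abs(plus) ** 2)        # [0.5, 0.5]

# Entanglement: CNOT on (H|0>) tensor |0> yields the Bell state (|00> + |11>)/sqrt(2).
bell = CNOT @ np.kron(plus, zero)
probs = np.abs(bell) ** 2
print("P(00), P(01), P(10), P(11):", probs.round(3))          # [0.5, 0, 0, 0.5]
```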

Alas, superposition is fickle, and when decoherence forces a qubit out of superposition, it no longer possesses quantum properties. The solution is called error correction, and quantum computing pioneers like IBM, Microsoft and Google are hard at work making it happen.

For a more comprehensive explanation of quantum computing, check out this primer. And don't miss this irresistible video featuring IBM scientist Talia Gershon explaining quantum computers to five individuals, from an eight-year-old to a theoretical physicist from Yale.

Possibilities for quantum use cases include predictive analytics and advanced modeling, which could help streamline and optimize large-scale transit operations and fleet maintenance, energy exploration, disaster prevention and recovery, as well as climate change mitigation. Also on the radar: chemistry simulations of molecules and atoms whose complex behavior is driven by quantum mechanics and is simply too hard for conventional machines to handle. Meanwhile, automakers, including Volkswagen, are investigating quantum computing in search of improved battery chemistry for electric vehicles.

In oil refining, massive machines called hydrocrackers are used to upgrade low-quality heavy gas oils into high-quality, clean-burning jet fuel, diesel, and gasoline. Extremely complicated and costly to maintain, hydrocrackers may sit idle several months each year, but implementing a predictive modeling application has enabled hydrocracker operators to shave months of downtime off these behemoths. The idea: make all acute repairs when the machine is down and use technology to predict what might break next and fix it preemptively. Adding quantum-driven AI as the brain for the hydrocracker could further minimize downtime because a quantum computer could calculate exponentially more scenarios than current technology.

In another example of the immense potential of the technology, bright minds from the University of Glasgow's School of Physics & Astronomy recently announced that they have adapted a quantum algorithm, Grover's algorithm, to drastically cut the time it takes to identify and analyze gravitational wave signals.

One of the most interesting use cases is artificial intelligence. Indeed, adding quantum power to AI could be what takes present-day Narrow AI to the next level: General AI. The quantum-AI hydrocracker brain described above is a possible example of General AI. Quantum computing could also propel machines toward sentience within specific fields. Imagine computers perfectly empathizing and emulating emotions, with the ability to respond to complex signals like expressions, eye movement, and body language. Perhaps one day, quantum computing could drive us all the way to that barely fathomable third level of AI, Super AI, where machines outperform humans in every way.

Today's quantum machines are scientific marvels, and they are evolving rapidly. By 2025, IBM says, it envisions that developers across all levels of the quantum computing stack will rely on its advanced hardware through a cloud-based API. The hope is that by 2030, companies and users will be running billions, if not a trillion, quantum circuits a day. Big Blue, whose most powerful machine currently packs 127 qubits, expects to have a 1,121-qubit version in 2023.

Quantum computing is fascinating, promising, and just plain cool. Still, we may need to slow the hype machine down a tad, as significant challenges must be overcome before the technology can be commercialized. Functional, stable, production-scale quantum machines could be up to a decade away. But once they materialize, we can start writing software for the quantum stack and begin to realize all these tantalizing quantum computing use cases.

About the Author

Wolf Ruzicka is Chairman at EastBanc Technologies. Wolf is a technology industry veteran with more than 25 years of experience leading enterprise business strategy and innovation. He joined EastBanc Technologies in 2007, originally as CEO. During his tenure, Wolf also served as President of APIphany, a division of EastBanc Technologies, through its acquisition by Microsoft. Wolf's vision and customer-centric approach to digital transformation are credited with helping establish EastBanc Technologies as a leader delivering sophisticated solutions that enable customers to win in today's digital economy. Follow Wolf on LinkedIn.

See the article here:
A Quantum Leap in the Making Meet Tomorrow's Super Super Computers - TechNative

Emulating impossible ‘unipolar’ laser pulses paves the way for processing quantum information – University of Michigan News

The semiconductor nanosheets in the water-cooled copper mount turn an infrared laser pulse into an effectively unipolar terahertz pulse. The team says that their terahertz emitter could be made to fit inside a matchbox. Image credit: Christian Meineke, Huber Lab, University of Regensburg

A laser pulse that sidesteps the inherent symmetry of light waves could manipulate quantum information, potentially bringing us closer to room temperature quantum computing.

The study, led by researchers at the University of Regensburg and the University of Michigan, could also accelerate conventional computing.

Quantum computing has the potential to accelerate solutions to problems that need to explore many variables at the same time, including drug discovery, weather prediction and encryption for cybersecurity. Conventional computer bits encode either a 1 or 0, but quantum bits, or qubits, can encode both at the same time. This essentially enables quantum computers to work through multiple scenarios simultaneously, rather than exploring them one after the other. However, these mixed states don't last long, so the information processing must be faster than electronic circuits can muster.

While laser pulses can be used to manipulate the energy states of qubits, different ways of computing would be possible if the charge carriers used to encode quantum information could be moved around, including a room-temperature approach. Terahertz light, which sits between infrared and microwave radiation, oscillates fast enough to provide the speed, but the shape of the wave is also a problem. Namely, electromagnetic waves are obliged to produce oscillations that are both positive and negative, which sum to zero.

The positive cycle may move charge carriers, such as electrons. But then the negative cycle pulls the charges back to where they started. To reliably control the quantum information, an asymmetric light wave is needed.

"The optimum would be a completely directional, unipolar wave, so there would be only the central peak, no oscillations. That would be the dream. But the reality is that light fields that propagate have to oscillate, so we try to make the oscillations as small as we can," said Mackillo Kira, U-M professor of electrical engineering and computer science and leader of the theory aspects of the study in Light: Science & Applications.

Since waves that are only positive or only negative are physically impossible, the international team came up with a way to do the next best thing. They created an effectively unipolar wave with a very sharp, high-amplitude positive peak flanked by two long, low-amplitude negative peaks. This makes the positive peak forceful enough to move charge carriers while the negative peaks are too small to have much effect.
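The idea can be sketched numerically: subtract a broad, shallow Gaussian from a narrow, tall one of equal area, and the net field integrates to zero while the central positive peak towers over the negative tails. The time constants below are illustrative, not values from the study.

```python
import numpy as np

# Effectively unipolar waveform sketch: a sharp positive peak flanked by long,
# shallow negative tails whose total area cancels the peak. Time constants are
# illustrative only, not measured values from the paper.
t = np.linspace(-10e-12, 10e-12, 20001)        # time axis, +/- 10 ps
tau_pos, tau_neg = 0.2e-12, 1.0e-12            # narrow peak vs. broad tails
A = 1.0
B = A * tau_pos / tau_neg                      # equal Gaussian areas -> zero net field

E = A * np.exp(-(t / tau_pos) ** 2) - B * np.exp(-(t / tau_neg) ** 2)

net_area = E.sum() * (t[1] - t[0])             # ~0: a propagating field must integrate to zero
peak_ratio = E.max() / abs(E.min())            # positive peak several times the negative dip
print(f"net area ~ {net_area:.1e}, peak-to-dip ratio ~ {peak_ratio:.1f}")
```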

They did this by carefully engineering nanosheets of a gallium arsenide semiconductor to design the terahertz emission through the motion of electrons and holes, which are essentially the spaces left behind when electrons move in semiconductors. The nanosheets, each about as thick as one thousandth of a hair, were made in the lab of Dominique Bougeard, a professor of physics at the University of Regensburg in Germany.

Then, the group of Rupert Huber, also a professor of physics at the University of Regensburg, stacked the semiconductor nanosheets in front of a laser. When the near-infrared pulse hit the nanosheet, it generated electrons. Due to the design of the nanosheets, the electrons welcomed separation from the holes, so they shot forward. Then, the pull from the holes drew the electrons back. As the electrons rejoined the holes, they released the energy they'd picked up from the laser pulse as a strong positive terahertz half-cycle, preceded and followed by a weak, long negative half-cycle.

"The resulting terahertz emission is stunningly unipolar, with the single positive half-cycle peaking about four times higher than the two negative ones," Huber said. "We have been working for many years on light pulses with fewer and fewer oscillation cycles. The possibility of generating terahertz pulses so short that they effectively comprise less than a single half-oscillation cycle was beyond our bold dreams."

Next, the team intends to use these pulses to manipulate electrons in room temperature quantum materials, exploring mechanisms for quantum information processing. The pulses could also be used for ultrafast processing of conventional information.

"Now that we know the key factor of unipolar pulses, we may be able to shape terahertz pulses to be even more asymmetric and tailored for controlling semiconductor qubits," said Qiannan Wen, a Ph.D. student in applied physics at U-M and a co-first-author of the study, along with Christian Meineke and Michael Prager, Ph.D. students in physics at the University of Regensburg.

Collaborators at Justus Liebig University Giessen and Helmut Schmidt University, both in Germany, contributed to the experiment and the characterization of the nanosheets.

This research was supported by the German Research Foundation (DFG), W.M. Keck Foundation and the National Science Foundation.

Study: Scalable high-repetition-rate sub-half-cycle terahertz pulses from spatially indirect interband transitions (DOI: 10.1038/s41377-022-00824-6)

Original post:
Emulating impossible 'unipolar' laser pulses paves the way for processing quantum information - University of Michigan News