Archive for the ‘Quantum Computer’ Category

What’s Next In AI, Chips And Masks – SemiEngineering

Aki Fujimura, chief executive of D2S, sat down with Semiconductor Engineering to talk about AI and Moore's Law, lithography, and photomask technologies. What follows are excerpts of that conversation.

SE: In the eBeam Initiative's recent Luminary Survey, the participants had some interesting observations about the outlook for the photomask market. What were those observations?

Fujimura: In the last couple of years, mask revenues have been going up. Prior to that, mask revenues were fairly steady at around $3 billion per year. Recently, they have gone up beyond the $4 billion level, and they're projected to keep going up. Luminaries believe a component of this increase is because of the shift in the industry toward EUV. One question in the survey asked participants, "What business impact will COVID have on the photomask market?" Some people think it may be negative, but the majority believe that it's not going to have much of an effect, or that it might have a positive effect. At a recent eBeam Initiative panel, the panelists commented that the reason for a positive outlook might be the demand picture in the semiconductor industry. The shelter-in-place and work-from-home environments are creating more need and opportunities for the electronics and semiconductor industries.

SE: How will extreme ultraviolet (EUV) lithography impact mask revenues?

Fujimura: In general, two-thirds of the participants in the survey believe that it will have a positive impact. When you go to EUV, you have fewer masks. This is because EUV brings the industry back to single patterning. 193nm immersion with multiple patterning requires more masks at advanced nodes. With EUV, you have fewer masks, but the mask for each EUV layer is more expensive.

SE: For decades, the IC industry has followed the Moore's Law axiom that transistor density in chips doubles every 18 to 24 months. At this cadence, chipmakers can pack more and smaller transistors on a die, but Moore's Law appears to be slowing down. What comes next?

Fujimura: The definition of Moore's Law is changing. It's no longer about the trends in CPU clock speeds. Those aren't changing much; scaling is now more by bit width than by clock speed. A lot of that has to do with thermal properties and other things. We have some theories on where we can make that better over time. On the other hand, if you look at things like massively parallel computing using GPUs, or having more CPU cores, and how quickly you can access memory or how much memory you can access, then if you include those things, Moore's Law is very much alive. For example, D2S supplies computing systems for the semiconductor manufacturing industry, so we are also a consumer of technology. We do heavy supercomputing, so it's important for us to understand what's happening on the computing capability side. What we see is that our ability to compute is continuing to improve at about the same rate as before. But as programmers, we have to adapt how we take advantage of it. It's not like you can take the same code and it automatically scales like it did 20 years ago. You have to understand how that scaling is different at any given point in time. You have to figure out how you can take advantage of the strengths of the new generation of technology and then shift your code. So it's definitely harder.

SE: Whats happening with the logic roadmap?

Fujimura: We're at 5nm in terms of what people are starting to do now. They are starting to plan 3nm and 2nm. And in terms of getting to the 2nm node, people are pretty comfortable. The question is what happens beyond that. It wasn't too long ago that people were saying, "There's no way we're going to have 2nm." That's been the general pattern in the semiconductor industry. The industry is constantly re-inventing itself. It is extending things longer than people ever thought possible. For example, look how long 193nm optical lithography lasted at advanced nodes. At one time, people were waiting for EUV. There was once a lot of doom and gloom about EUV. But despite it being late, companies developed new processes and patterning schemes to extend 193nm. It takes coordination by a lot of people to make this happen.

SE: How long can we extend the current technology?

Fujimura: There's no question that there is a physical limit, but we are still good for the next 10 years.

SE: There's a lot of activity around AI and machine learning. Where do you see deep learning fitting in?

Fujimura: Deep learning is a subset of machine learning. It's the subset that's made machine learning revolutionary. The general idea of deep learning is to mimic how the brain works with a network of neurons, or nodes. The programmer first determines what kind of network to use. The programmer then trains the network by presenting it with a whole bunch of data. Often, the network is trained with labeled data. Using defect classification as an example, a human or some other program labels each picture as being a defect or not, and may also label what kind of defect it is, or even how it should be repaired. The deep learning engine iteratively optimizes the weights in the network, automatically finding a set of weights that makes the network best mimic the labels. Then, the network is tried on data that it wasn't trained on, to test whether it learned as intended.
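The train-on-labels, test-on-unseen-data loop described above can be sketched with a toy single-neuron "network" on synthetic labeled defect data. Everything here (the two-feature data, cluster positions, learning rate) is invented for illustration; real defect classifiers use much deeper networks on images.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "defect" data: each sample is a 2-D feature vector
# (think brightness, edge density); label 1 = defect, 0 = clean.
X_defect = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(100, 2))
X_clean = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(100, 2))
X = np.vstack([X_defect, X_clean])
y = np.array([1] * 100 + [0] * 100)

# Hold out 20% as data the network is never trained on.
idx = rng.permutation(len(X))
train, test = idx[:160], idx[160:]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Iteratively optimize the weights so the output mimics the labels.
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = sigmoid(X[train] @ w + b)
    grad_w = X[train].T @ (p - y[train]) / len(train)
    grad_b = np.mean(p - y[train])
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

# Try the trained network on the unseen data to see if it learned.
preds = (sigmoid(X[test] @ w + b) > 0.5).astype(int)
accuracy = np.mean(preds == y[test])
print(f"held-out accuracy: {accuracy:.2f}")
```

The held-out evaluation at the end is the point of the sketch: the weights were never shown the test samples, so the accuracy measures whether the network generalized rather than memorized.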

SE: What can't deep learning do?

Fujimura: Deep learning does not reason. Deep learning does pattern matching. Amazingly, it turns out that many of the world's problems are solvable purely with pattern matching. What you can do with deep learning is a set of things that you just can't do with conventional programming. I was an AI student in the early 1980s. Many of the best computer scientists in the world back then (and ever since) were trying hard to create a chess program that could beat the chess masters. It wasn't possible until deep learning came along. Applied to semiconductor manufacturing, or any field, there are classes of problems that had not been practically solvable without deep learning.

SE: Years ago, there wasn't enough compute power to make machine learning feasible. What changed?

Fujimura: The first publication describing convolutional neural networks was in 1975. The researcher, Dr. Kunihiko Fukushima, called it the "neocognitron" back then, but the paper basically describes deep learning. But computational capability simply wasn't sufficient. Deep learning was enabled by what I call "useful waste" in massive computations by cost-effective GPUs.

SE: What problems can deep learning solve?

Fujimura: Deep learning can be used for any data. For example, people use it for text-to-speech, speech-to-text, or automatic translation. Where deep learning is most evolved today is with two-dimensional data and image processing. A GPU happens to be a good platform for deep learning because of its single instruction, multiple data (SIMD) processing nature. The SIMD architecture is also good at image processing, so it makes sense that it's applied in that way. So for any problem in which a human expert can look at a picture, without any other background knowledge, and tell something with high probability, deep learning is likely to do well.
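The SIMD point can be illustrated even on a CPU: in the vectorized sketch below, a single array expression applies a (hypothetical) horizontal-gradient filter to every pixel position at once, with no per-pixel loop. That same one-instruction-over-many-data pattern is what GPUs accelerate for both image processing and deep learning.

```python
import numpy as np

# A random grayscale "image" standing in for real inspection data.
rng = np.random.default_rng(1)
image = rng.integers(0, 256, size=(64, 64)).astype(np.float32)

# Horizontal gradient via array slicing: one data-parallel expression
# computes the difference between every pixel and its left neighbor.
gradient = image[:, 1:] - image[:, :-1]
print(gradient.shape)
```

On a GPU the same expression would be dispatched across thousands of lanes; the program logic does not change, only the hardware executing it.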

SE: What about machine learning in semiconductor manufacturing?

Fujimura: We have already started to see products incorporating deep learning, both in software and equipment. Any tedious and error-prone process that human operators need to perform, particularly one involving visual inspection, is a great candidate for deep learning. There are many opportunities in inspection and metrology. There are also many opportunities in software to produce more accurate results faster, to help with the turnaround-time issues in leading-edge mask shops. And there are many opportunities in correlating big data from mask shops and machine log files with machine learning for predictive maintenance.

SE: What are the challenges?

Fujimura: Deep learning is only as good as the data it is given, so caution is required in deploying it. For example, if deep learning is used to screen resumes by learning from labels provided by prior hiring practices, it learns the biases that are already built into those past practices, even if unintended. If operators tend to make a certain type of mistake in categorizing an image, a deep learning model that learned from data labeled by those operators' past behavior would learn to make the same mistake. If deep learning is used to identify suspected criminal behavior in images captured by street cameras based on a past history of arrests, it will try its best to mimic the past behavior. If deep learning is used to identify what a social media user tends to want to see in order to maximize advertising revenues, it will learn to be extremely good at showing the user exactly what the user tends to watch, even if that is highly biased, fake, or inappropriate. If misused, deep learning can accentuate and accelerate human addiction and biases. Deep learning is a powerful weapon that relies on the humans wielding it to use it carefully.

SE: Is machine learning more accurate than a human in performing pattern recognition tasks?

Fujimura: In many cases, it's found that a deep learning-based program can inference with a higher percentage of accuracy than a human, particularly when you look at it over time. A human might be able to look at a picture and recognize it with 99% accuracy. But if the same human has to look at a much larger data set, and do it eight hours a day for 200 days a year, the performance of the human is going to degrade. That's not true for a computer-based algorithm, including deep learning. The learning algorithms process vast amounts of data. They go through small sections at a time and go through every single one without skipping anything. When you take that into account, deep learning programs can be useful for these error-prone processes that are visually oriented, or that can be cast into being visually oriented.

SE: The industry is working on other technologies to replicate the functions of the brain. Neuromorphic computing is one example. How realistic is this?

Fujimura: The brain is amazing. It will take a long time to create a neural network of the actual brain. There are very interesting computing models in the future. Neuromorphic is not a different computing model; it's a different architecture of how you do it. It's unclear whether neuromorphic computing will necessarily create new kinds of capabilities, but it does make some of them more efficient and effective.

SE: What about quantum computing?

Fujimura: The big change is quantum computing. That takes a lot of technology, money, and talent. It's not an easy technology to develop. But you can bet that leading technology countries are working on it, and there is no question in my mind that it's important. Take security, for example. 256-bit encryption is nothing to a capable quantum computer. Security mechanisms would have to be significantly revamped in the world of quantum computing. Quantum computing used in the wrong way can be destructive. Staying ahead of that is a matter of national security. But quantum computing also can be very powerful in solving problems that were considered intractable. Many iterative optimization problems, including deep learning training, will see major discontinuities with quantum computing.

SE: Let's move back to the photomask industry. Years ago, the mask was simple. Over time, masks have become more complex, right?

Fujimura: At 130nm, or around there, you started to see decorations on the mask. If you wanted to draw a circle on the wafer using Manhattan, or rectilinear, shapes, you actually drew a square on the mask. Eventually, it would become a circle on the wafer. However, starting at around 130nm, that square on the mask had to be written with decorations in all four corners. Then, SRAFs (sub-resolution assist features) started to appear on the mask around 90nm. There might have been some at 130nm, but mostly at 90nm. By 22nm, you couldn't find a critical-layer mask that didn't have SRAFs on it. SRAFs are features on the mask that are designed explicitly not to print on the wafer. Through an angle, SRAFs project light into the main features that you do want to print on the wafer, augmenting the amount of energy being applied to the resist. This makes the printing of the main features more resilient to manufacturing process variation.

SE: Then multiple patterning appeared around 16nm/14nm, right?

Fujimura: The feature sizes became smaller and more complex. When we reached the limit of resolution for 193i, there was no choice but to go to multiple patterning, where multiple masks printed one wafer layer. You divide the features that you want on a given wafer layer and you put them on different masks. This provided more space for SRAFs for each of the masks. EUV for some layers is projected to go to multiple patterning, too. It costs more to do multiple patterning, but it is a familiar and proven technique for extending lithography to smaller nodes.

SE: To pattern a photomask, mask makers use e-beam mask writer systems based on variable shaped beam (VSB) technology. Now, using thousands of tiny beams, multi-beam mask writers are in the market. How do you see this playing out?

Fujimura: Most semiconductor devices are being patterned using VSB writers for the critical layers. That's working fine, though the write times are increasing. If you look at the eBeam Initiative's recent survey, the average write times are still around 8 hours. Going forward, we are moving toward more complex processes with EUV masks. Today, EUV masks are fairly simple; rectangular writing is enough. But you need multi-beam mask writers because of the resist sensitivity. The resists are slow in order to be more accurate. We need to apply a lot of energy to make them work, and that is better done with multi-beam mask writers.

SE: What's next for EUV masks?

Fujimura: EUV masks will require SRAFs, too. They don't today at 7nm, but SRAFs are necessary for smaller features. And for 193i as well as for EUV, curvilinear masks are being considered now for improvements in wafer quality, particularly in resilience to manufacturing variation. For EUV in particular, because of the reflective optics, curvilinear SRAFs are needed even more. Because multi-beam mask writing enables curvilinear mask shapes without a write-time penalty, the enhanced wafer quality in the same mask write time is attractive.

SE: What are the big mask challenges going forward?

Fujimura: There are still many. EUV pellicles, affordable defect-free EUV mask blanks, high-NA EUV, and actinic or e-beam-based mask inspection, both in the mask shop and in the wafer shop for requalification, are all important areas for advancement. Now, the need to adopt curvilinear mask shapes has been widely acknowledged. Data processing, including a compact and lossless data representation that is fast to write and read, is an important challenge. Optical proximity correction (OPC) and inverse lithography technology (ILT), which are needed to produce these curvilinear mask shapes to maximize wafer performance, need to run fast enough to be practical.

SE: What are the challenges in developing curvilinear shapes on masks?

Fujimura: There are two issues. First, without multi-beam mask writers, producing masks with curvilinear shapes can be too expensive, or may simply take too long to write. Second, controlling the mask variation is challenging. Once again, the reason you want curvilinear shapes on the mask is because wafer quality improves substantially. That is even more important for EUV than in 193nm immersion lithography. EUV masks are reflective, so there is also a 6-degree incidence angle on EUV masks, and that creates more desire to have curvilinear shapes for SRAFs. SRAFs don't print on the wafer; they are printed on the mask in order to help decrease process variation on the wafer.

SE: What about ILT?

Fujimura: ILT is an advanced form of OPC that computes the desired mask shapes in order to maximize the quality of wafer lithography. Studies have shown that ILT, in particular unconstrained curvilinear ILT, can produce the best results in terms of resilience to manufacturing variation. D2S and Micron recently presented a paper on the benefits of full-chip, curvilinear, stitchless ILT with mask-wafer co-optimization for memory applications. This approach enabled more than a 2X improvement in process windows.

SE: Will AI play a big role in mask making?

Fujimura: Yes. In particular, with deep learning, the gap between a promising prototype and a production-level inference engine is very wide. While there was quite a bit of initial excitement over deep learning, the world still has not seen much production adoption of it. A large part of this comes from the need for data. In semiconductor manufacturing, data security is extremely important. So while a given manufacturer has plenty of data of its own kind, a vendor of any given tool, whether software or equipment, has a difficult time getting enough customer data. Even for a manufacturer, creating new data (say, a SEM picture of a defect) can be difficult and time-consuming. Yet deep learning programming is programming with data, instead of writing new code. If a deep learning programmer wants to improve the success rate of an inference engine from 92% to 95%, that programmer needs to analyze the engine to see what types of data it additionally needs to be trained on to make that improvement, then acquire many instances of that type of data, and then iterate. The only way this can be done efficiently and effectively is to have digital twins, a simulated environment that generates data instead of relying only on physical real sample data. Getting to an 80% success rate can be done with thousands of collected real samples. But getting to a 95% success rate requires digital twins. It is the lack of this understanding that is preventing production deployment of deep learning in many potential areas. It is clear to me that many tedious and error-prone processes can benefit from deep learning. And it is also clear to me that acceleration of many computing tasks using deep learning will benefit the deployment of new software capabilities in the mask shop.
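The digital-twin idea can be sketched minimally: an invented simulator stands in for a real SEM, emitting labeled samples on demand instead of waiting for rare physical defect captures. The defect model below (a bright blob injected into noise) is purely hypothetical, not how D2S or any vendor generates training data.

```python
import numpy as np

rng = np.random.default_rng(42)

def digital_twin_sample(n, defect_prob=0.5):
    """Hypothetical simulator standing in for a physical imaging
    process: returns (images, labels) pairs on demand."""
    labels = rng.random(n) < defect_prob
    images = rng.normal(size=(n, 8, 8))
    # Simulated "defects" get a bright 2x2 blob at a random spot.
    for i in np.flatnonzero(labels):
        r, c = rng.integers(0, 6, size=2)
        images[i, r:r+2, c:c+2] += 4.0
    return images, labels.astype(int)

# A scarce "real" set (hundreds of samples) vs. an on-demand
# simulated set (thousands) to drive the next training iteration.
real_images, real_labels = digital_twin_sample(200)
more_images, more_labels = digital_twin_sample(5000)
print(len(more_labels), "simulated samples generated")
```

The workflow in the passage then loops: measure which failure modes the inference engine still gets wrong, bias the simulator toward those modes, retrain, and repeat.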

Related Stories

EUVs Uncertain Future At 3nm And Below

Challenges Linger For EUV

Mask/Lithography Issues For Mature Nodes

The Evolution Of Digital Twins

Next-Gen Mask Writer Race Begins

See the original post:
What's Next In AI, Chips And Masks - SemiEngineering

CCNY & partners in quantum algorithm breakthrough | The City College of New York – The City College of New York News

Researchers led by City College of New York physicist Pouyan Ghaemi report the development of a quantum algorithm with the potential to study a class of many-electron quantum systems using quantum computers. Their paper, entitled "Creating and Manipulating a Laughlin-Type ν=1/3 Fractional Quantum Hall State on a Quantum Computer with Linear Depth Circuits," appears in the December issue of PRX Quantum, a journal of the American Physical Society.

"Quantum physics is the fundamental theory of nature, which leads to the formation of molecules and the resulting matter around us," said Ghaemi, assistant professor in CCNY's Division of Science. "It is already known that when we have a macroscopic number of quantum particles, such as electrons in a metal, which interact with each other, novel phenomena such as superconductivity emerge."

However, until now, according to Ghaemi, tools to study systems with large numbers of interacting quantum particles and their novel properties have been extremely limited.

"Our research has developed a quantum algorithm which can be used to study a class of many-electron quantum systems using quantum computers. Our algorithm opens a new avenue to use the new quantum devices to study problems which are quite challenging to study using classical computers. Our results are new and motivate many follow-up studies," added Ghaemi.

On possible applications for this advancement, Ghaemi, who's also affiliated with the Graduate Center, CUNY, noted: "Quantum computers have witnessed extensive developments during the last few years. Development of new quantum algorithms, regardless of their direct application, will contribute to realizing applications of quantum computers."

"I believe the direct application of our results is to provide tools to improve quantum computing devices. Their direct real-life application would emerge when quantum computers can be used for daily life applications."

His collaborators included scientists from Western Washington University; the University of California, Santa Barbara; Google AI Quantum; and the University of Michigan, Ann Arbor.

About the City College of New York

Since 1847, The City College of New York has provided a high-quality and affordable education to generations of New Yorkers in a wide variety of disciplines. CCNY embraces its position at the forefront of social change. It is ranked #1 by the Harvard-based Opportunity Insights out of 369 selective public colleges in the United States on the overall mobility index. This measure reflects both access and outcomes, representing the likelihood that a student at CCNY can move up two or more income quintiles. In addition, the Center for World University Rankings places CCNY in the top 1.8% of universities worldwide in terms of academic excellence. Labor analytics firm Emsi puts CCNY's annual economic impact on the regional economy (5 boroughs and 5 adjacent counties) at $1.9 billion and quantifies the dollar-for-dollar return on investment to students, taxpayers, and society. At City College, more than 16,000 students pursue undergraduate and graduate degrees in eight schools and divisions, driven by significant funded research, creativity, and scholarship. CCNY is as diverse, dynamic, and visionary as New York City itself. View CCNY Media Kit.

See original here:
CCNY & partners in quantum algorithm breakthrough | The City College of New York - The City College of New York News

NTT's Kazuhiro Gomi says Bio Digital Twin, quantum computing the next-gen tech – Backend News

At the recently concluded Philippine Digital Convention (PH Digicon 2020) by PLDT Enterprise, Kazuhiro Gomi, president and CEO of NTT Research, shared the fundamental research milestones coming out of its three labs: the Physics and Informatics (PHI) Lab, the Cryptography and Information Security (CIS) Lab, and the Medical and Health Informatics (MEI) Lab, which are hoped to lead to monumental tech innovations.

The three-day virtual convention drew in more than 3,000 views during the live stream broadcast of the plenary sessions and breakout sessions covering various topics.

Gomi headlined the second day with his topic, "Upgrading Reality," a glimpse into breakthrough research that NTT Research is currently working on that could hasten digital transformations.

In a discussion with Cathy Yap-Yang, FVP and head of Corporate Communications at PLDT, Gomi elaborated on next-generation technologies, particularly the Bio Digital Twin project, which could potentially be game-changing in the medical field, quantum computing, and advanced cryptography.

Bio Digital Twin

The Bio Digital Twin is an initiative to build a digital replica of a patient's internal systems, which functions as a model for testing procedures and chemical reactions, showing possible results before actual application to the person.

"We are trying to create an electronic replica of the human body. If we are able to create something like that, the future of clinical and medical activities will be very different," Gomi said. "If we have a precise replica of your human body, you can predict what type of disease or what type of problem you might have maybe three years down the road. Or, if your doctor needs to test a new drug for you, he can do so on the digital twin."

NTT Research is a fundamental research organization in Silicon Valley that carries out advanced research for some of the world's most important and impactful technologies, including quantum computing, cryptography, information security, and medical and health informatics.

Computing power

However, to get there and make the Bio Digital Twin possible, there are hurdles from various disciplines, including the component of computing power.

Gomi explained that people believe today's computers can do everything, but in reality they might take years to solve complex problems that a quantum computer could solve in seconds.

There are different kinds of quantum computers, but all are based upon quantum physics. At NTT Research, Gomi revealed that their group is working on a quantum computer called a coherent Ising machine, which could solve combinatorial optimization problems.

"We may be able to bring those superfast machines to market, to reality, much quicker. That is what we are aiming for," he said.

Basically, the machine, using many parameters and complex optimization, finds the best solution in a matter of seconds, a task that may take months or years using conventional computers.
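The problem class a coherent Ising machine targets can be shown with a classical stand-in: minimizing the energy of a small random Ising instance, here via simulated annealing. The instance, size, and cooling schedule below are illustrative only; the optical hardware searches the same energy landscape in a completely different way.

```python
import numpy as np

rng = np.random.default_rng(7)

# Random Ising instance: minimize E(s) = -1/2 * s^T J s over
# spin vectors s with entries in {-1, +1}.
n = 12
J = rng.normal(size=(n, n))
J = (J + J.T) / 2          # symmetric couplings
np.fill_diagonal(J, 0)     # no self-coupling

def energy(s):
    return -0.5 * s @ J @ s

# Classical simulated annealing: flip one spin at a time, always
# accept improvements, sometimes accept uphill moves while "hot".
s = rng.choice([-1, 1], size=n)
best, best_e = s.copy(), energy(s)
T = 2.0
for _ in range(5000):
    i = rng.integers(n)
    s_new = s.copy()
    s_new[i] *= -1
    dE = energy(s_new) - energy(s)
    if dE < 0 or rng.random() < np.exp(-dE / T):
        s = s_new
        if energy(s) < best_e:
            best, best_e = s.copy(), energy(s)
    T *= 0.999  # cool gradually

print("best energy found:", round(float(best_e), 3))
```

Logistics, traffic control, and drug lead optimization map onto the same template: encode each candidate decision as a spin and each cost or constraint as a coupling, then search for the lowest-energy configuration.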

Some examples where quantum computing may be applied include lead optimization problems, such as effects on small-molecule drugs, peptide drugs, and biocatalysts, or resource optimization challenges, such as logistics, traffic control, or the use of wireless networks. Gomi also expounded on compressed sensing cases, including use in astronomical telescopes, magnetic resonance imaging (MRI), and computed tomography.

Quantum computing

Apart from quantum computing, Gomi reiterated the issues of cybersecurity and privacy. Today, encryption is able to address those challenges, but it will soon require a more advanced and sophisticated type of technology if we are to upgrade reality.

"From the connected world, obviously we want to exchange more data among each other, but we have to make sure that security and privacy are maintained. We have to have those things together to get the best out of a connected world," he said.

Among next-generation advanced encryptions, Gomi highlighted attribute-based encryption, where various decryption keys define access control over the encrypted data. For example, depending on the user (or the type of key he or she has), what they are allowed to view differs, as controlled by the key issuer.
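A toy sketch of the access model only: real attribute-based encryption enforces this behavior cryptographically (typically with pairing-based schemes), not with an if-statement, and the attributes and policy names here are invented for illustration.

```python
# Keys carry attributes; ciphertexts carry a policy. A key decrypts
# only if its attributes satisfy the policy. (Access model only --
# there is no actual cryptography in this sketch.)

def issue_key(attributes):
    return frozenset(attributes)

def encrypt(plaintext, policy):
    # policy: the set of attributes a key must contain to decrypt
    return {"policy": frozenset(policy), "data": plaintext}

def decrypt(ciphertext, key):
    if ciphertext["policy"] <= key:  # subset test: policy satisfied?
        return ciphertext["data"]
    raise PermissionError("key attributes do not satisfy policy")

doctor_key = issue_key({"role:doctor", "dept:cardiology"})
nurse_key = issue_key({"role:nurse"})

record = encrypt("patient ECG data", {"role:doctor"})
print(decrypt(record, doctor_key))  # allowed; the nurse key is refused
```

The point of the real scheme is that the refusal is mathematical, not procedural: a key lacking the right attributes simply cannot reconstruct the plaintext, no matter what code handles it.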

He noted that in the next couple of years, we should be able to commercialize this type of technology. "We can maintain privacy while encouraging the sharing of data with this mechanism."

Gomi reiterated that we are at the stage of all kinds of digital transformations.

Digital transformation

"Those digital transformations are making our lives so much richer and business so much more interesting and efficient. I would imagine those digital transformations will continue to advance even more," he said.

However, there are limiting factors that could impede or slow down those digital transformations, such as energy consumption; the limits of Moore's Law, as we cannot expect too much more from the capacities of current electronic chips; and the issues of privacy and security. Hence, we need to address those factors.

PH Digicon 2020 is the annual convention organized by PLDT Enterprise, which gathered global industry leaders to speak on the latest advancements in the digital landscape. This year's roster of speakers included tech experts and heads from Cisco, Nokia, Salesforce, and NTT Research, as well as goop CEO and multi-awarded Hollywood actress Gwyneth Paltrow, who headlined the first virtual run.

Read the rest here:
NTTs Kazuhiro Gomi says Bio Digital Twin, quantum computing the next-gen tech - Backend News

Safe Quantum Consulting Poised to Speed Adoption of Quantum-Based Encryption Solutions – PR Web

John Prisco, CEO, Safe Quantum Inc.

BETHESDA, Md. (PRWEB) November 10, 2020

Companies facing security risks have never had more to lose, and quantum computing could dramatically alter the security field, according to John J. Prisco, founder of Safe Quantum Inc., a new consultancy that works with corporate IT security teams to design, develop and deploy quantum-safe technologies that will protect their most critical data and intellectual property.

"Quantum computing is no longer just theoretical; it's here, and any corporation that relies on valued information is in the crosshairs for a quantum attack," said Prisco, who has specialized in security and telecommunications networks over the span of his more than 30-year career.

Quantum computers behave differently from traditional computers, which rely on serial processing of 1s and 0s. Quantum computers instead exploit the physics of entanglement and superposition, enabling parallel data processing at incomparable speeds: they can solve in minutes a computation that would take a conventional computer thousands of years. The same quantum physics can also be used to create seemingly unbreakable cryptographic keys.
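The superposition claim can be made concrete with a classical statevector simulation: n qubits carry 2^n amplitudes at once, which is also why simulating them classically becomes infeasible quickly. This sketch uses plain NumPy rather than any quantum SDK.

```python
import numpy as np

# Three qubits start in |000>: a vector of 2**3 = 8 amplitudes.
n = 3
state = np.zeros(2**n)
state[0] = 1.0

# Hadamard gate: puts a single qubit into an equal superposition.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

# Apply H to every qubit by building the full 8x8 operator each time.
for q in range(n):
    op = np.array([[1.0]])
    for k in range(n):
        op = np.kron(op, H if k == q else np.eye(2))
    state = op @ state

# The register now holds all 8 basis states with equal amplitude.
print(np.round(state, 3))
```

Each added qubit doubles the vector a classical simulator must store and transform; quantum hardware holds those amplitudes natively, which is the source of the speed claims above.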

Prisco was most recently president and CEO at Quantum Xchange, Inc., a quantum technology startup focused on the future of secure key exchange and encryption.

"The first and biggest threat from quantum computers is security," Prisco said. "Quantum key distribution (QKD), quantum-safe math algorithms (PQC), and quantum random number generators (QRNG) provide a menu of encryption approaches that create a defense-in-depth strategy for securing critical data transmissions. This means that companies don't have to execute capital-intensive rip-and-replace security projects to get the benefits of quantum-safe solutions. We are showing them how, and we are emphasizing that they need to get started now."

QKD is a method of secure communication over physical networks that enables two parties to create a shared but randomly generated secret key. This key is used to encrypt and decrypt messages or files, any kind of data transfer. But the real secret is that QKD can detect a third party eavesdropping on the connection, trying to gain information about the encryption key. Using inherent quantum mechanics and superposition, QKD ensures the key is provably random, and therefore unbreakable. To put this in context, traditional public-key cryptography uses mathematical algorithms that can be determined and, eventually, broken.
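The eavesdropper-detection property can be illustrated with a toy BB84-style simulation, a deliberately simplified model rather than a real QKD implementation: an intercept-resend attacker disturbs roughly 25% of the sifted bits, which the two parties detect by comparing a sample of their keys.

```python
import numpy as np

rng = np.random.default_rng(3)

def bb84(n_bits, eavesdrop=False):
    """Toy BB84 sketch: sender encodes random bits in random bases,
    receiver measures in random bases, and matching-basis positions
    form the sifted key. Returns the sifted-key error rate."""
    bits = rng.integers(0, 2, n_bits)
    send_basis = rng.integers(0, 2, n_bits)

    channel = bits.copy()
    if eavesdrop:
        # Intercept-resend attacker: wherever Eve guesses the wrong
        # basis, the bit the receiver later sees is randomized.
        eve_basis = rng.integers(0, 2, n_bits)
        wrong = eve_basis != send_basis
        channel[wrong] = rng.integers(0, 2, wrong.sum())

    recv_basis = rng.integers(0, 2, n_bits)
    measured = channel.copy()
    wrong = recv_basis != send_basis
    measured[wrong] = rng.integers(0, 2, wrong.sum())  # wrong basis -> random

    sift = recv_basis == send_basis
    return float(np.mean(bits[sift] != measured[sift]))

print("error rate, no eavesdropper:", bb84(4000))
print("error rate, with eavesdropper:", bb84(4000, eavesdrop=True))
```

With no attacker the sifted keys match exactly; with an intercept-resend attacker the error rate jumps to about 25%, so a high error rate on the compared sample tells the parties to discard the key and start over.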

"John has a facility with scientific thinking rarely encountered among business people," said Dr. Whitfield Diffie, Turing Award laureate and co-inventor of public-key cryptography.

Earlier in his career, John led Triumfant, the first cybersecurity company to perfect anomaly-detection techniques to identify and remediate advanced threats in computer memory systems. He has delivered successful exits for investors at Triumfant, GeoVantage and Ridgeway Systems, among others.

He began his career in telecommunications, first by founding Penn Access, a competitive local access carrier in Pittsburgh, and later by leading 2nd Century Communications, the nation's first packet-based competitive local exchange carrier, and eLink Communications, a building local exchange carrier in New York City.

Prisco earned a master of science degree in quantum mechanics from the Massachusetts Institute of Technology after a bachelor of science degree in electrical and electronics engineering from Columbia University.

About Safe Quantum Inc.

For more information, meet John J. Prisco.

See the original post here:
Safe Quantum Consulting Poised to Speed Adoption of Quantum-Based Encryption Solutions - PR Web

Quantum Computing in the Cloud: Can It Live Up to the Hype? – Electronic Design


Quantum computing has earned its place on the Gartner hype cycle. Pundits have claimed that it will take over and change everything forever. The reality will likely be somewhat less dramatic, although it's fair to say that quantum computers could spell the end for conventional cryptography. Clearly, this has implications for technologies like blockchain, which are slated to support financial systems of the future.

While the Bitcoin system, for example, is calculated to keep classical mining computers busy until 2140, brute-force decryption using a quantum computer could theoretically mine every token almost instantaneously. More powerful digital ledger technologies based on quantum cryptography could level the playing field.
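Some back-of-envelope arithmetic puts the mining claim in perspective. For unstructured search problems like hash preimage hunting, a quantum computer's known advantage comes from Grover's algorithm, which is quadratic rather than unlimited: a space of size N takes about (π/4)·√N iterations instead of ~N/2 classical trials. The sketch below assumes a hypothetical 72-bit difficulty target purely for illustration:

```python
import math

def grover_iterations(n_bits):
    """Optimal number of Grover iterations to search a space of 2**n_bits."""
    N = 2 ** n_bits
    return (math.pi / 4) * math.sqrt(N)

# Hypothetical 72-bit mining target (illustrative, not Bitcoin's actual difficulty)
classical_trials = 2 ** 72 / 2          # expected classical brute-force trials
quantum_iters = grover_iterations(72)   # ~5.4e10, quadratically fewer

print(f"classical trials:  {classical_trials:.2e}")
print(f"grover iterations: {quantum_iters:.2e}")
```

Even a quadratic speedup is enormous here, but "almost instantaneously" still depends on each Grover iteration being fast and on error-corrected machines large enough to run billions of coherent iterations.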

All of this presupposes that quantum computing will become usable and affordable on a widespread scale. As things stand, this certainly seems achievable. Serious computing players, including IBM, Honeywell, Google, and Microsoft, as well as newer specialist startups, all have active programs that are putting quantum computing in the cloud right now and inviting engagement from the wider computing community. Introduction packs and development kits are available to help new users get started.

Democratizing Access

These are important moves that will almost certainly drive further advancement as users come up with more diverse and demanding workloads and figure out ways of handling them using quantum technology. Equally important is the anticipated democratizing effect of widespread cloud access, which should bring more people from a wider variety of backgrounds into contact with quantum to understand it, use it, and influence its ongoing development.

Although it's here, quantum computing remains at a very experimental stage. In the future, commercial cloud services could provide affordable access, much as scientific or banking organizations today rent cloud AI applications for complex workloads and are billed by the number of compute cycles used.

Hospitals, for example, are taking advantage of genome sequencing apps hosted on AI accelerators in hyperscale data centers to identify genetic disorders in newborn babies. The process costs just a few dollars and the results are back within minutes, enabling timely and potentially life-saving intervention by clinicians.

Quantum computing as a service could further transform healthcare as well as deeply affect many other fields such as materials science. Simulating a caffeine molecule, for example, is incredibly difficult to do with a classical computer, demanding the equivalent of over 100 years of processing time. A quantum computer can complete the task in seconds. Other applications that could benefit include climate analysis, transportation planning, bioinformatics, financial services, encryption, and codebreaking.

A Real Technology Roadmap

For all its power, quantum computing isn't here to kill off classical computing or turn the entire world upside down. Because quantum bits (qubits) can exist in a superposition of the states 0 and 1, unlike conventional binary bits that are in one state or the other, a register of qubits can represent exponentially more information. However, a qubit's state when measured is determined by probability, so quantum computing is only suited to certain types of algorithms. Others are handled better by classical computers.
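The probabilistic nature of measurement can be sketched classically. A single qubit a|0⟩ + b|1⟩ collapses to 0 with probability |a|², so repeated runs of the same circuit yield a distribution of outcomes rather than one answer (the function below is an invented illustration, assuming the amplitudes are normalized so |a|² + |b|² = 1):

```python
import math
import random

def measure_zeros(a, b, shots=10_000, seed=0):
    """Simulate measuring the qubit a|0> + b|1> repeatedly; return how many
    of the shots collapse to |0>. Assumes |a|**2 + |b|**2 == 1."""
    rng = random.Random(seed)
    p0 = abs(a) ** 2
    return sum(1 for _ in range(shots) if rng.random() < p0)

h = 1 / math.sqrt(2)            # equal superposition, as after a Hadamard gate
zeros = measure_zeros(h, h)
print(zeros)                     # close to 5000: each shot is 0 or 1 with p = 1/2
```

This is why quantum algorithms are designed so that interference concentrates probability on the correct answer before measurement, and why each problem must map well onto that structure.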

In addition, building and running a quantum computer is incredibly difficult and complex. On top of that, the challenges intensify as we try to increase the number of qubits in the system. As with any computer, more bits correspond to more processing power, so increasing the qubit count is a key objective for quantum-computer architects.

Keeping the system stable, with a low error rate, for longer periods is another objective. One way to achieve this is by cryogenically cooling the equipment to near absolute zero to eliminate thermal noise. Furthermore, extremely pure and clean RF sources are needed. I'm excited that, at Rohde & Schwarz, we are working with our academic partners to apply our ultra-low-noise R&S SGS100A RF sources (Fig. 1) to help increase qubit count and stability.

1. Extremely pure and clean RF sources like the R&S SGS100A are needed in quantum-computing applications.

The RF source is one of the most important building blocks because it determines the number of errors that must be corrected when reading out the quantum-computation results. A cleaner RF signal increases quantum-system stability, reducing errors due to quantum decoherence that would otherwise result in information loss.

Besides the low phase- and amplitude-noise requirements, multichannel solutions are essential to scale up a quantum-computing system. Moreover, as we start to consider scalability, a small form factor for the signal sources becomes even more relevant. We're combining our RF expertise with the software and system know-how of our partners in pursuit of a complete solution.

Equipment Needs

In addition, scientists are constantly looking for new materials to be applied in quantum-computing chips and need equipment to help them accurately determine their exact properties. Then, once a new quantum chip is manufactured, its resonance frequencies must be measured to ensure that no undesired resonances exist. Rohde & Schwarz has developed high-performance vector network analyzers (Fig. 2) for both tasks and can assist in the debugging of the quantum-computing system itself.

2. VNAs such as the R&S ZNA help determine properties of material used in quantum computing.

Our partners are relying on us to provide various other test-and-measurement solutions to help them increase the performance and capabilities of quantum computers. IQ mixing is a crucial part of a quantum computer, for example, and our spectrum analyzers help characterize and calibrate the IQ mixers and suppress undesired sidebands. Moreover, R&S high-speed oscilloscopes (Fig. 3) enable precise temporal synchronization of signals in the time domain, which is needed to set up and debug quantum-computing systems.

3. High-speed oscilloscopes, for example, the R&S RTP, can be used to set up and debug quantum-computing systems.

As we work with our partners in the quantum world to improve our products for a better solution fit, we're also learning how to apply that knowledge to other products in our portfolio. In turn, this helps deliver even better-performing solutions.

While cloud access will enable more companies and research institutes to take part in the quantum revolution, bringing this technology into everyday use requires a lot more work on user friendliness. That involves relaxing the cryogenic temperature requirements, stabilizing quantum computers with high qubit counts, and doing it all at a competitive price.

Already, however, we can see that quantum has the potential to profoundly change everything it touches. No hype is needed.

Sebastian Richter is Vice President of Market Segment ICR (Industry, Components, Research & Universities) at Rohde & Schwarz.

More here:
Quantum Computing in the Cloud: Can It Live Up to the Hype? - Electronic Design