Archive for the ‘Quantum Computing’ Category

Alan Turing’s Everlasting Contributions to Computing, AI and Cryptography – NIST

An enigma machine on display outside the Alan Turing Institute entrance inside the British Library, London.

Credit: Shutterstock/William Barton

Suppose someone asked you to devise the most powerful computer possible. Alan Turing, whose reputation as a central figure in computer science and artificial intelligence has only grown since his untimely death in 1954, applied his genius to problems such as this one in an age before computers as we know them existed. His theoretical work on this problem and others remains a foundation of computing, AI and modern cryptographic standards, including those NIST recommends.

The road from devising the most powerful computer possible to cryptographic standards has a few twists and turns, as does Turing's brief life.

Alan Turing

Credit: National Portrait Gallery, London

In Turing's time, mathematicians debated whether it was possible to build a single, all-purpose machine that could solve all problems that are computable. For example, we can compute a car's most energy-efficient route to a destination, and (in principle) the most likely way in which a string of amino acids will fold into a three-dimensional protein. Another example of a computable problem, important to modern encryption, is whether or not a bigger number can be expressed as the product of two smaller numbers. For example, 6 can be expressed as the product of 2 and 3, but 7 cannot be factored into smaller integers and is therefore a prime number.
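The factoring question above is easy to state in code. Here is a minimal Python sketch using trial division, purely for illustration (real cryptographic factoring concerns numbers hundreds of digits long, where this approach is hopeless):

```python
def smallest_factor(n):
    """Return the smallest factor d with 1 < d < n, or None if n is prime."""
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return d
    return None

print(smallest_factor(6))  # 2, since 6 = 2 * 3
print(smallest_factor(7))  # None: 7 is prime
```

The loop only needs to test divisors up to the square root of n, since any larger factor would pair with a smaller one already checked.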

Some prominent mathematicians proposed elaborate designs for universal computers that would operate by following very complicated mathematical rules. It seemed overwhelmingly difficult to build such machines. It took the genius of Turing to show that a very simple machine could in fact compute all that is computable.

His hypothetical device is now known as a Turing machine. The centerpiece of the machine is a strip of tape, divided into individual boxes. Each box contains a symbol (such as A, C, T, G for the letters of the genetic code) or a blank space. The strip of tape is analogous to today's hard drives that store bits of data. Initially, the string of symbols on the tape corresponds to the input, containing the data for the problem to be solved. The string also serves as the memory of the computer. The Turing machine writes onto the tape data that it needs to access later in the computation.

Credit: NIST

The device reads an individual symbol on the tape and follows instructions on whether to change the symbol or leave it alone before moving to another symbol. The instructions depend on the current state of the machine. For example, if the machine needs to decide whether the tape contains the text string "TC," it can scan the tape in the forward direction while switching between the states "previous letter was T" and "previous letter was not T." If, while in state "previous letter was T," it reads a C, it goes to a state "found it" and halts. If it encounters the blank symbol at the end of the input, it goes to the state "did not find it" and halts. Nowadays we would recognize the set of instructions as the machine's program.
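The state machine described above can be sketched in a few lines of Python. This is an illustrative simulation using the article's state names, not Turing's original tape formalism:

```python
def contains_tc(tape):
    """Scan the tape left to right, tracking whether the previous letter was T."""
    state = "previous letter was not T"
    for symbol in tape:
        if state == "previous letter was T" and symbol == "C":
            return "found it"   # a C immediately followed a T: halt
        state = ("previous letter was T" if symbol == "T"
                 else "previous letter was not T")
    return "did not find it"    # reached the end of the input: halt

print(contains_tc("GATC"))  # found it
print(contains_tc("ACGT"))  # did not find it
```

Note how little machinery is needed: one variable for the state, one pass over the tape. That simplicity is exactly Turing's point.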

It took some time, but eventually it became clear to everyone that Turing was right: The Turing machine could indeed compute all that seemed computable. No number of additions or extensions to this machine could extend its computing capability.

To understand what can be computed, it is helpful to identify what cannot be computed. In a previous life as a university professor I had to teach programming a few times. Students often encounter the following problem: "My program has been running for a long time; is it stuck?" This is called the Halting Problem, and students often wondered why we simply couldn't detect infinite loops without actually getting stuck in them. It turns out a program to do this is an impossibility. Turing showed that there does not exist a machine that detects whether or not another machine halts. From this seminal result followed many other impossibility results. For example, logicians and philosophers had to abandon the dream of an automated way of detecting whether an assertion (such as whether there are infinitely many prime numbers) is true or false, as that is uncomputable. If you could do this, then you could solve the Halting Problem simply by asking whether the statement "this machine halts" is true or false.
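The contradiction at the heart of Turing's proof can be sketched in Python. The function `halts` below is hypothetical; the whole point of the argument is that no correct implementation of it can exist:

```python
# Hypothetical decider: halts(f) would return True if calling f()
# eventually terminates, False otherwise. Turing proved no correct
# implementation is possible.
def halts(f):
    raise NotImplementedError("no such decider can exist")

# If halts() did exist, this program would defeat it: it halts exactly
# when halts() predicts it loops, and loops exactly when halts()
# predicts it halts. Either prediction is therefore wrong.
def contrarian():
    if halts(contrarian):
        while True:      # predicted to halt, so loop forever
            pass
    # predicted to loop forever, so halt immediately
```

Feeding a decider its own contrarian opponent is the same self-reference trick used in the true/false example above.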

Turing went on to make fundamental contributions to AI, theoretical biology and cryptography. His involvement with this last subject brought him honor and fame during World War II, when he played a very important role in adapting and extending cryptanalytic techniques invented by Polish mathematicians. This work broke the German Enigma machine encryption, making a significant contribution to the war effort.

Turing was gay. After the war, in 1952, the British government convicted him for having sex with a man. He stayed out of jail only by submitting to what is now called chemical castration. He died in 1954 at age 41 by cyanide poisoning, which was initially ruled a suicide but may have been an accident according to subsequent analysis. More than 50 years would pass before the British government apologized and pardoned him (after years of campaigning by scientists around the world). Today, the highest honor in computer science is called the Turing Award.

Turing's computability work provided the foundation for modern complexity theory. This theory tries to answer the question: Among those problems that can be solved by a computer, which ones can be solved efficiently? Here, "efficiently" means not in billions of years but in milliseconds, seconds, hours or days, depending on the computational problem.

For example, much of the cryptography that currently safeguards our data and communications relies on the belief that certain problems, such as decomposing an integer number into its prime factors, cannot be solved before the Sun turns into a red giant and consumes the Earth (currently forecast for 4 billion to 5 billion years). NIST is responsible for cryptographic standards that are used throughout the world. We could not do this work without complexity theory.

Technology sometimes throws us a curve, such as the discovery that if a sufficiently big and reliable quantum computer is built it would be able to factor integers, thus breaking some of our cryptography. In this situation, NIST scientists must rely on the world's experts (many of them in-house) in order to update our standards. There are deep reasons to believe that quantum computers will not be able to break the cryptography that NIST is about to roll out. Among these reasons is that Turing's machine can simulate quantum computers. This implies that complexity theory gives us limits on what a powerful quantum computer can do.

But that is a topic for another day. For now, we can celebrate how Turing provided the keys to much of today's computing technology and even gave us hints on how to solve looming technological problems.

Original post:
Alan Turing's Everlasting Contributions to Computing, AI and Cryptography - NIST

Quantum computing: D-Wave shows off prototype of its next quantum annealing computer – ZDNet

Image: Wacomka/Shutterstock

Quantum-computing outfit D-Wave has announced commercial access to an "experimental prototype" of its Advantage2 quantum annealing computer.

D-Wave is beating its own path to qubit processors with its quantum annealing approach. According to D-Wave, the Advantage2 prototype available today features over 500 qubits. It's a preview of a much larger Advantage2, with 7,000 qubits, that the company hopes to make available by 2024.

Access to the Advantage2 prototype is restricted to customers who have a subscription to D-Wave's Leap cloud service, but developers interested in trying D-Wave's quantum cloud can sign up to get "one minute of free use of the actual quantum processing units (QPUs) and quantum hybrid solvers" that run on its earlier Advantage QPU.

The Advantage2 prototype is built with D-Wave's Zephyr connection technology, which the company claims offers higher connectivity between qubits than Pegasus, the predecessor topology used in its Advantage QPU.

D-Wave says the Zephyr design enables shorter chains in its Advantage2 quantum chips, which can make them friendlier for calculations that require extra precision.


"The Advantage2 prototype is designed to share what we're learning and gain feedback from the community as we continue to build towards the full Advantage2 system," says Emile Hoskinson, director of quantum annealing products at D-Wave.

"With Advantage2, we're pushing that envelope again, demonstrating that connectivity and reduction in noise can be a delivery vehicle for even greater performance once the full system is available. The Advantage2 prototype is an opportunity for us to share our excitement and give a sneak peek into the future for customers bringing quantum into their applications."

While quantum computing is still experimental, senior execs are preparing for it to become a business disruptor by 2030, according to a survey by consultancy EY. The firm found that 81% of senior UK executives expect quantum computing to play a significant role in their industry by 2030.

Fellow consultancy McKinsey this month noted funding for quantum technology startups doubled in the past two years, from $700 million in 2020 to $1.4 billion in 2021. McKinsey sees quantum computing shaking up pharmaceuticals, chemicals, automotive, and finance industries, enabling players to "capture nearly $700 billion in value as early as 2035" through improved simulation and better machine learning. It expects revenues from quantum computing to exceed $90 billion by 2040.

D-Wave's investors include PSP Investments, Goldman Sachs, BDC Capital, NEC Corp, Aegis Group Partners, and the CIA's VC firm, In-Q-Tel.

Continue reading here:
Quantum computing: D-Wave shows off prototype of its next quantum annealing computer - ZDNet

Chicago Quantum Exchange takes first steps toward a future that could revolutionize computing, medicine and cybersecurity – Chicago Tribune

Flashes of what may become a transformative new technology are coursing through a network of optic fibers under Chicago.

Researchers have created one of the world's largest networks for sharing quantum information, a field of science that depends on paradoxes so strange that Albert Einstein didn't believe them.

The network, which connects the University of Chicago with Argonne National Laboratory in Lemont, is a rudimentary version of what scientists hope will someday become the internet of the future. For now, it's opened up to businesses and researchers to test fundamentals of quantum information sharing.

The network was announced this week by the Chicago Quantum Exchange, which also involves Fermi National Accelerator Laboratory, Northwestern University, the University of Illinois and the University of Wisconsin.

People work in the Pritzker Nanofabrication Facility, June 15, 2022, inside the William Eckhardt Research Center at the University of Chicago. The Chicago Quantum Exchange is expanding its quantum network to make it available to more researchers and companies. Quantum computing is a pioneering, secure format said to be hacker-proof and of possible use by banks, the health care industry, and others for secure communications. (Erin Hooley / Chicago Tribune)

With a $500 million federal investment in recent years and $200 million from the state, Chicago, Urbana-Champaign, and Madison form a leading region for quantum information research.

Why does this matter to the average person? Because quantum information has the potential to help crack currently unsolvable problems, both threaten and protect private information, and lead to breakthroughs in agriculture, medicine and climate change.

While classical computing uses bits of information containing either a 1 or a zero, quantum bits, or qubits, are like a coin flipped in the air: they contain both a 1 and a zero, to be determined once the qubit is observed.

That quality of being in two or more states at once, called superposition, is one of the many paradoxes of quantum mechanics, the rules governing how particles behave at the atomic and subatomic level. It's also a potentially crucial advantage, because it can handle exponentially more complex problems.
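The word "exponentially" can be made concrete with a small sketch. In the standard state-vector picture, describing n qubits classically requires 2**n complex amplitudes, which is why simulating even modest quantum systems quickly overwhelms ordinary computers (a toy NumPy illustration, not a real quantum simulator):

```python
import numpy as np

def uniform_superposition(n):
    """State vector with equal amplitude in all 2**n basis states of n qubits."""
    dim = 2 ** n
    return np.full(dim, 1.0 / np.sqrt(dim))

state = uniform_superposition(3)
print(len(state))          # 8 amplitudes for just 3 qubits
print(np.sum(state ** 2))  # squared amplitudes (probabilities) sum to 1
```

At 50 qubits the vector would need 2**50 amplitudes, roughly a quadrillion numbers, which is the scale at which classical simulation becomes impractical.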

Another key aspect is the property of entanglement, in which qubits separated by great distances can still be correlated, so a measurement in one place reveals a measurement far away.

The newly expanded Chicago network, created in collaboration with Toshiba, distributes particles of light, called photons. Trying to intercept the photons destroys them and the information they contain, making the network far more difficult to hack.

"The new network allows researchers to push the boundaries of what is currently possible," said University of Chicago professor David Awschalom, director of the Chicago Quantum Exchange.

Fourth-year graduate student Cyrus Zeledon, left, and postdoctoral student Leah Weiss, right, show senior undergraduate Tiarna Wise around one of the quantum science laboratories, June 15, 2022, inside the William Eckhardt Research Center at the University of Chicago. (Erin Hooley / Chicago Tribune)

However, researchers must solve many practical problems before large-scale quantum computing and networking are possible.

For instance, researchers at Argonne are working on creating a foundry where dependable qubits could be forged. One example is a diamond membrane with tiny pockets to hold and process qubits of information. Researchers at Argonne also have created a qubit by freezing neon to hold a single electron.

Because quantum phenomena are extremely sensitive to any disturbance, they might also be used as tiny sensors for medical or other applications, but they'd also have to be made more durable.

The quantum network was launched at Argonne in 2020, but has now expanded to Hyde Park and opened for use by businesses and researchers to test new communication devices, security protocols and algorithms. Any venture that depends on secure information, such as banks' financial records or hospitals' medical records, could potentially use such a system.

Quantum computers, while in development now, may someday be able to perform far more complex calculations than current computers, such as folding proteins, which could be useful in developing drugs to treat diseases such as Alzheimer's.

In addition to driving research, the quantum field is stimulating economic development in the region. A hardware company, EeroQ, announced in January that its moving its headquarters to Chicago. Another local software company, Super.tech, was recently acquired, and several others are starting up in the region.

Because quantum computing could be used to hack into traditional encryption, it has also attracted the bipartisan attention of federal lawmakers. The National Quantum Initiative Act was signed into law by President Donald Trump in 2018 to accelerate quantum development for national security purposes.

In May, President Joe Biden directed federal agencies to migrate to quantum-resistant cryptography on their most critical defense and intelligence systems.

Ironically, basic mathematical problems, such as 5 + 5 = 10, are somewhat difficult for quantum computers. Quantum information is likely to be used for high-end applications, while classical computing will likely continue to be practical for many daily uses.

Renowned physicist Einstein famously scoffed at the paradoxes and uncertainties of quantum mechanics, saying that "God does not play dice with the universe." But quantum theories have been proven correct in applications from nuclear energy to MRIs.

Stephen Gray, senior scientist at Argonne, who works on algorithms to run on quantum computers, said quantum work is very difficult, and that no one understands it fully.

But there have been significant developments in the field over the past 30 years, leading to what some scientists jokingly call "Quantum 2.0," with practical advances expected over the next decade.

"We're betting in the next five to 10 years there'll be a true quantum advantage (over classical computing)," Gray said. "We're not there yet. Some naysayers shake their canes and say it's never going to happen. But we're positive."

Just as early work on conventional computers eventually led to cellphones, it's hard to predict where quantum research will lead, said Brian DeMarco, professor of physics at the University of Illinois at Urbana-Champaign, who works with the Chicago Quantum Exchange.

"That's why it's an exciting time," he said. "The most important applications are yet to be discovered."

rmccoppin@chicagotribune.com

Follow this link:
Chicago Quantum Exchange takes first steps toward a future that could revolutionize computing, medicine and cybersecurity - Chicago Tribune

Businesses brace for quantum computing disruption by end of decade – The Register

While business leaders expect quantum computing to play a significant role in industry by 2030, some experts don't believe the tech is going to be ready for production deployment in the near future.

The findings, from a survey titled "2022 Quantum Readiness" commissioned by consultancy EY, refer to UK businesses, although it is likely that the conclusions are equally applicable to global organizations.

According to EY, 81 percent of senior UK executives expect quantum computing to have a significant impact in their industry within seven and a half years, with almost half (48 percent) believing that quantum technology will begin to transform industries as soon as 2025.

The naysayers who say quantum tech won't be ready for live deployment any time soon have a point: the industry also suffers from a hype problem, with capabilities being exaggerated and even some accusations of alleged falsification, as when quantum startup IonQ was recently accused by Scorpion Capital of misleading investors about the effectiveness of its quantum hardware.

Joseph Reger, Fujitsu Fellow, CTO of Central and Eastern Europe, and a member of the World Economic Forum's Quantum Computing Council, told The Register he is getting some "heat" for saying quantum is not nearly a thing yet.

"There are impressive advantages that pre-quantum or quantum-inspired technologies provide. They are less sexy, but very powerful."

He added: "Some companies are exaggerating the time scales. If quantum computing gets overhyped, we are likely to face the first quantum winter."

Fujitsu is itself developing quantum systems, and announced earlier this year that it was working to integrate quantum computing with traditional HPC technology. The company also unveiled a high performance quantum simulator based on its PRIMEHPC FX 700 systems that it said will serve as an important bridge towards the development of quantum computing applications in future.

Meanwhile, EY claims that respondents were "almost unanimous" in their belief that quantum computing will create a moderate or high level of disruption for their own organization, industry sector, and the broader economy in the next five years.

Despite this, the survey finds that strategic planning for quantum computing is still at an embryonic stage for most organizations, with only 33 percent involved in strategic planning for how quantum will affect them and only a quarter having appointed specialist leaders or set up pilot teams.

The survey conducted in February-March 2022 covered 501 UK-based executives, all with senior roles in their organisations, who had to demonstrate at least a moderate (but preferably a high) level of understanding of quantum computing. EY said they originally approached 1,516 executives, but only 501 met this requirement, which in and of itself tells a tale.

EY's Quantum Computing Leader, Piers Clinton-Tarestad, said the survey reveals a disconnect between the pace at which some industry leaders expect quantum to start affecting business and their preparedness for those impacts.

"Maximizing the potential of quantum technologies will require early planning to build responsive and adaptable organisational capabilities," he said, adding that this is a challenge because the progress of quantum has accelerated, but it is "not following a steady trajectory."

For example, companies with quantum processors have increased the power of their hardware dramatically over the past several years, from just a handful of qubits to over a hundred in the case of IBM, which expects to deliver a 4,158-qubit system by 2025. Yet despite these advances, quantum computers remain a curiosity, with most operational systems deployed in research laboratories or made available via a cloud service for developers to experiment with.

Clinton-Tarestad said "quantum readiness" is "not so much a gap to be assessed as a road to be walked," with the next steps in the process being regularly revisited as the landscape evolves. He warned that businesses expecting to see disruption in their industry within the next three to five years need to act now.

According to EY's report, executives in consumer and retail markets are those most likely to believe that quantum will play a significant role by 2025, with just over half of technology, media and telecommunications (TMT) executives expecting an impact within the same time frame. Most respondents among health and life sciences companies think this is more likely to happen later, between 2026 and 2035.

Most organizations surveyed expect to start their quantum preparations within the next two years, with 72 percent aiming to start by 2024.

However, only a quarter of organizations have got as far as recruiting people with the necessary skills to lead quantum computing efforts, although 68 percent said they are aiming to set up pilot teams to explore the potential of quantum for their business by 2024.

Fear of falling behind because rival companies are working to develop their own quantum capabilities is driving some respondents to start quantum projects. The applications anticipated by industry leaders would advance operations involving AI and machine learning, especially among financial services, automotive and manufacturing companies. TMT respondents cited potential applications in cryptography and encryption as the most likely use of quantum computing.

While the EY report warns about companies potentially losing out to rivals on the benefits of quantum computing, there are also dangers that organizations should be preparing for now, as Intel warned about during its Intel Vision conference last month.

One of these is that quantum computers could be used to break current cryptographic algorithms, meaning that the confidentiality of both personal and enterprise data could be at risk. This is not a far-off threat, but something that organizations need to consider right now, according to Sridhar Iyengar, VP of Intel Labs and Director of Security and Privacy Research.

"Adversaries could be harvesting encrypted data right now, so that they can decrypt it later when quantum computers are available. This could be sensitive data, such as your social security number or health records, which are required to be protected for a long period of time," Iyengar told us.

Organizations may want to address threats like this by taking steps such as evaluating post-quantum cryptography algorithms and increasing the key sizes for current crypto algorithms like AES.
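The key-size advice follows from Grover's algorithm, which roughly halves the effective bit strength of a symmetric key on a quantum computer. A sketch of the arithmetic (the halving is a back-of-the-envelope rule of thumb, not a precise security bound):

```python
import secrets

def quantum_effective_bits(key_bytes):
    """Rough post-Grover strength estimate: classical bit length, halved."""
    classical_bits = len(key_bytes) * 8
    return classical_bits // 2

aes128_key = secrets.token_bytes(16)  # 128-bit key
aes256_key = secrets.token_bytes(32)  # 256-bit key
print(quantum_effective_bits(aes128_key))  # 64
print(quantum_effective_bits(aes256_key))  # 128
```

By this estimate, AES-128 drops to roughly 64-bit security against a quantum adversary, while AES-256 retains about 128 bits, which is why moving to larger symmetric keys is a common interim recommendation alongside post-quantum algorithms.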

Or they may simply decide to adopt a wait-and-see attitude. EY will no doubt be on hand to sell consultancy services to help clarify their thinking.

Go here to read the rest:
Businesses brace for quantum computing disruption by end of decade - The Register

McKinsey thinks quantum computing could create $80b in revenue … eventually – The Register

In the hype-tastic world of quantum computing, consulting giant McKinsey & Company claims that the still-nascent field has the potential to create $80 billion in new revenue for businesses across industries.

It's a claim McKinsey has repeated nearly two dozen times on Twitter since March to promote its growing collection of research diving into various aspects of quantum computing, from startup and government funding to use cases and its potential impact on a range of industries.

The consulting giant believes this $80 billion figure represents the "value at stake" for quantum computing players but not the actual value that use cases could create [PDF]. This includes companies working in all aspects of quantum computing, from component makers to service providers.

Despite wildly optimistic numbers, McKinsey does ground the report in a few practical realities. For instance, in a Wednesday report, the firm says the hardware for quantum systems "remains too immature to enable a significant number of use cases," which, in turn, limits the "opportunities for fledgling software players." The authors add that this is likely one of the reasons why the rate of new quantum startups entering the market has begun to slow.

Even the top of McKinsey's page for quantum computing admits that capable systems won't be ready until 2030, which is in line with what various industry players, including Intel, are expecting. Like fusion, it's always a decade or so away.

McKinsey, like all companies navigating whether quantum computing has any real-world value, is trying to walk a fine line, exploring the possibilities of quantum computing while showing the ways the tech is still disconnected from ordinary enterprise reality.

"While quantum computing promises to help businesses solve problems that are beyond the reach and speed of conventional high-performance computers, use cases are largely experimental and hypothetical at this early stage. Indeed, experts are still debating the most foundational topics for the field," McKinsey wrote in a December 2021 article about how use cases "are getting real."

One could argue the report is something of a metaphor for the quantum industry in 2022: wild optimism about future ecosystem profitability, without really understanding what the tech will mean, to whom, and at what scale.

Read the rest here:
McKinsey thinks quantum computing could create $80b in revenue ... eventually - The Register