Archive for the ‘Quantum Computing’ Category

Australia to buy quantum computer from US | Information Age | ACS – ACS

The Commonwealth is planning to build a quantum computer. Image: Shutterstock

EXCLUSIVE: The Commonwealth government is looking to buy a quantum computing system through a secret procurement process that is rumoured to favour a US-based company, leaving Australia's quantum sector annoyed by the apparent snub.

Sources told Information Age the government has been looking to buy its first quantum computer from PsiQuantum, a California-based firm with a stated mission to build and deploy the world's first useful quantum computer.

The Department of Industry and Science did not respond to Information Age's request for comment.

Australia has a wealth of local expertise in quantum technologies and has, for decades, been a world leader in the nascent field's research and development.

When Industry and Science Minister Ed Husic took office last year, he publicly expressed a desire to take advantage of local talent, knowledge, and manufacturing capabilities to make Australia the quantum capital of the globe.

Indeed, Husic's department led the development of Australia's first quantum strategy.

But the government's apparent move to go overseas for what one insider described as "Australia's biggest ever investment in quantum" has been seen by many in the industry as a slap in the face.

Husic's office did not respond to Information Age's request for comment.

One industry source, who wished to remain anonymous, questioned why there wasn't an open tender process and said they would have liked the opportunity to form a consortium of Australian companies to apply.

While they didn't disagree in principle with the idea of the Commonwealth buying a quantum computer, the quantum expert said a government decision to buy technology from a US-based company could negatively impact how the local industry is perceived by international investors and buyers.

The government has not previously stated an intention to buy a quantum computer. In this year's budget the Department of Industry and Science added around $20 million for a quantum commercialisation centre and $40 million for the Critical Technologies Challenges Program.

Internationally, government-funded quantum computing projects have proved expensive. The Finnish government last month committed $116 million (€70 million) to scale up its 20-qubit system, while Germany announced in May that it will pour around $5 billion (€3 billion) into building a 100-qubit system by 2026.

Simon Devitt, a senior lecturer at the University of Technology Sydney and member of the government's National Quantum Advisory Committee, was willing to publicly state that he thinks the government buying as-yet-unproven technology is a ludicrous waste of money that would be better spent on funding to shore up local academic research.

"These systems are often extremely expensive and their value is questionable at the very least," he told Information Age.

"They do not provide any kind of commercial utility for HPC [high-performance computing], and the utility for developing quantum algorithms or in education is essentially non-existent."

Devitt could not speak to anything discussed in the National Quantum Advisory Committee.

Why quantum?

Quantum computers are probabilistic and can theoretically solve problems that would take a classical computer thousands of years to compute.

They have potential applications in areas like cryptography, finance, and pharmaceutical development, although quantum advantage (the ability of one of these systems to outperform classical supercomputers) has yet to be proven outside niche experimental settings.

Companies around the world are exploring different ways to create and maintain systems of sufficiently large, error-corrected quantum bits (qubits).

PsiQuantum is pursuing photonic quantum computing technology which involves storing and processing information using individual quanta of light.

The company claims its chips can be rigorously tested using industrial-scale facilities at room temperature, which it says gives them an edge over technologies that must remain cryogenically cooled for longer parts of the testing phase.

Photonic quantum computing is not entirely a room-temperature technology, however, since photon detectors still need to be cooled to near absolute zero.

Individual quantum photonic chips may have fewer qubits than competing technologies, but using light as a foundation may allow a cluster of connected chips to pass quantum information between one another via fibre-optic cables, letting systems scale up with existing technology.

PsiQuantum has an Australian link through its CEO and co-founder, Professor Jeremy O'Brien, who studied in Queensland and Western Australia and completed his PhD at the University of New South Wales.

The company is partnered with US semiconductor firm GlobalFoundries, which produces PsiQuantum's photonic chip wafers at an industrial scale.

PsiQuantum did not respond to Information Age's request for comment.

Read the rest here:
Australia to buy quantum computer from US | Information Age | ACS - ACS

7 Quantum Computing Stocks That AI Will Send Soaring – InvestorPlace

Quantum computing stocks represent an industry that has been around for a while. The field leverages quantum mechanics at subatomic scales and is being applied to boost computing speeds.

Quantum computing has, in the past few years, begun to attract more and more attention. The sector really heated up during the pandemic, before quantitative tightening began. It cooled as rates increased and riskier lending became more expensive.

As interest rates approach their peak, investors will begin to look at quantum computing stocks again. AI is a big part of the logic behind doing so: it will supercharge development in the field and could lead to breakthroughs. Thus, it's a good idea to invest early in anticipation of the reemergence of quantum computing stocks.

Source: Shutterstock

Investors would be wise to consider Defiance Quantum ETF (NYSEARCA:QTUM) for a balanced, low-risk introduction to quantum computing stocks. Its largest holding is the second firm discussed in this article, IonQ, at 2.27%.

QTUM shares offer lower operating costs, ease of trading, transparency, and tax efficiency when compared to individual stocks. Defiance Quantum ETF tracks the BlueStar Quantum Computing and Machine Learning Index. Given its AI/ML exposure, it should be no surprise, then, that QTUM shares have appreciated quickly this year. They've returned 14.97% year-to-date and more than 21% over the last year.

You'll pay a 0.4% net expense ratio to have your investment managed by a portfolio manager. Generally speaking, a net expense ratio above 1% is considered too high, so QTUM shares are not expensive.

QTUM shares have ranged as high as $53 over the last 52 weeks and, as AI has cooled, they've fallen back to $44 at present. That is a good entry point for inexpensive exposure to the confluence of AI and quantum computing.

Source: Amin Van / Shutterstock.com

IonQ (NYSE:IONQ) is your best bet for maximum exposure to the growth of the quantum computing sector. The company is a pure-play quantum computing developer and went public through a special purpose acquisition company (SPAC).

To be clear, IonQ is, in many ways, the opposite of QTUM, discussed immediately above. Start-ups, SPACs, and emerging technology all under one roof equate to higher risk overall.

IonQ is heavily invested in cloud computing. The firm is partnered with the three major cloud providers. The confluence of quantum computing, AI, and cloud promises to produce real growth moving forward that can make investors a lot of money.

IonQ's most powerful computer, called Aria, is being leveraged through Amazon's AWS, the leading cloud platform. That makes it a strong bet overall given AWS's dominance.

IonQ isn't making much money right now and is going to continue to invest heavily and incur large expenses for the near future, but if it pays off, it'll pay off big.

Source: JHVEPhoto / Shutterstock.com

IBM (NYSE:IBM) stock has floundered over the last decade. It's a legacy technology firm that has lost its way and is trying to regain its former glory. IBM's strategy to do so includes a focus on AI, cloud, and a segment dedicated to quantum computing.

Those bets are paying off. The company has leveraged its Watson AI and focused on creating a suite of generative AI tools. Early booking data suggests that IBM's strategy is working, and an annual run rate of $1 billion is expected after the firm bested expectations.

IBM has a dedicated quantum computing business unit, IBM Quantum. More than 200 firms and research organizations are using IBM Quantum to develop enterprise solutions in the field. IBM is aligned with the defense sector as it relates to quantum computing and AI. Other defense-adjacent firms, including Palantir (NYSE:PLTR), have soared this year as AI begins to take root in the national security realm.

Source: Shutterstock

FormFactor (NASDAQ:FORM) is a semiconductor firm that also makes cooling equipment used in quantum computing.

The stock benefits from trends that are just catching on, and with chip demand likely to grow alongside AI's continued growth, FormFactor will grow too. The company sells test equipment; thus, it's a picks-and-shovels play.

Outside of chip testing equipment, FormFactor also sells cryogenic systems. The so-called probe stations are chambers that are cooled to extremely low temperatures and used for testing chips for defects. Those same chambers have utility in quantum computing, where chips must likewise be kept at cryogenic temperatures.

FormFactor clearly benefits from secular trends. The long-term potential of the chip sector is high. Expectations of continued growth due to AI, machine learning, and quantum computing give FormFactor powerful catalysts overall.

Its cooling and testing equipment has every chance to be sold in higher volumes in the near future, and its shares will ebb and flow with the chip sector.

Source: The Art of Pics / Shutterstock.com

Name a technology, and Microsoft (NASDAQ:MSFT) probably has some interest and exposure thereto.

Microsoft has laboratories and world-class researchers in any number of fields doing varied research. Quantum computing is part of that.

Hardware, software, specialized cooling equipment, and more are being developed by the company. If Microsoft decides that quantum computing is the next big thing, expect it to move first, as it did with OpenAI and ChatGPT. It's an industry shaper, so the fact that it is developing quantum computers is a signal worth watching.

Investing in Microsoft is not a strong investment in quantum computing per se. Quantum computing revenues are a very minor part of its business. Choose MSFT shares for the dozen other strengths the company possesses, but keep in mind that quantum computing is part of the mix.

Azure is a major cloud provider. AI is being integrated there as fast as possible. Quantum computing promises to accelerate AI and could realistically compound the rapid shifts we're already experiencing. That makes MSFT a strong bet, and it offers much less risk than upstart firms in the space.

Source: Kate Krav-Rude / Shutterstock.com

Intel (NASDAQ:INTC) used to be the biggest chip stock. It isn't any longer, following many missteps. That leaves Intel, like IBM, searching for its former glory.

The company's strategy to turn itself around rests on several pillars. The firm's Arizona chip factories are a big part of that strategy.

Intel is positioning itself to take advantage of efforts to reshore the semiconductor industry. Construction of those factories is the major driver of its turnaround.

Intel is also developing a quantum computing chip called Tunnel Falls. The company is working with its partners to test that chip as part of its overall turnaround effort. As with the other large tech firms here, Intel isn't yet moving heavily into quantum computing. It remains a future technology that is part of a longer-term vision.

AI is the current focus along with reshoring. In time, though, quantum computing chips like Tunnel Falls will play a bigger part in Intel's turnaround story.

Source: josefkubes / Shutterstock.com

Honeywell (NASDAQ:HON) offers industrial software and is a stock that's commonly mentioned alongside megatrends like IoT and, to a degree, AI.

The company recently reorganized into three business segments to take advantage of megatrends. It'll now be automation, aviation, and energy transition that drive Honeywell overall.

The move is unlikely to change much of Honeywell's day-to-day operations, and it will continue to do much the same things. It'll still be heavily focused on the IoT building-automation opportunity. More, and smarter, chips will be required for that effort. In short, it's the same industrial firm it was, with a slightly revamped direction.

However, Honeywell is also a quantum computing firm, which many people might not recognize. Honeywell built a quantum computing unit that was spun off, merged, and is now known as Quantinuum. Honeywell owns a 54% stake in that firm.

That means Honeywell is a lesser-known quantum computing firm with a vested interest in the continued development of the sector and a revenue-generating asset therein.

On the date of publication, Alex Sirois did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Alex Sirois is a freelance contributor to InvestorPlace whose personal stock investing style is focused on long-term, buy-and-hold, wealth-building stock picks. Having worked in several industries from e-commerce to translation to education and utilizing his MBA from George Washington University, he brings a diverse set of skills through which he filters his writing.

Original post:
7 Quantum Computing Stocks That AI Will Send Soaring - InvestorPlace

Breaking the Quantum Limit: From Einstein-Bohr Debates to Achieving Unattainable Efficiency – SciTechDaily

In the Barz group's experiment with a two-stage interferometer, auxiliary photons are used to generate distinct measurement patterns for all four Bell states, increasing the efficiency beyond the traditional limit of 50%. Credit: Jon Heras, Cambridge Illustrators

Researchers at the University of Stuttgart have demonstrated that a key ingredient for many quantum computation and communication schemes can be performed with an efficiency that exceeds the commonly assumed upper theoretical limit thereby opening up new perspectives for a wide range of photonic quantum technologies.

Quantum science has not only revolutionized our understanding of nature, but is also inspiring groundbreaking new computing, communication, and sensor devices. Exploiting quantum effects in such quantum technologies typically requires a combination of deep insight into the underlying quantum-physical principles, systematic methodological advances, and clever engineering. And it is precisely this combination that researchers in the group of Prof. Stefanie Barz at the University of Stuttgart and the Center for Integrated Quantum Science and Technology (IQST) have delivered in a recent study, in which they improved the efficiency of an essential building block of many quantum devices beyond a seemingly inherent limit.

One of the protagonists in the field of quantum technologies is a property known as quantum entanglement. The first step in the development of this concept involved a passionate debate between Albert Einstein and Niels Bohr. In a nutshell, their argument was about how information can be shared across several quantum systems. Importantly, this can happen in ways that have no analog in classical physics.

The discussion that Einstein and Bohr started remained largely philosophical until the 1960s, when the physicist John Stewart Bell devised a way to resolve the disagreement experimentally. Bell's framework was first explored in experiments with photons, the quanta of light. Three pioneers in this field (Alain Aspect, John Clauser, and Anton Zeilinger) were jointly awarded last year's Nobel Prize in Physics for their groundbreaking work toward quantum technologies.

Bell himself died in 1990, but his name is immortalized not least in the so-called Bell states. These describe the quantum states of two particles that are as strongly entangled as is possible. There are four Bell states in all, and Bell-state measurements (which determine which of the four states a quantum system is in) are an essential tool for putting quantum entanglement to practical use. Perhaps most famously, Bell-state measurements are the central component in quantum teleportation, which in turn makes most quantum communication and quantum computation possible.
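The four Bell states mentioned here can be written out explicitly. Below is a minimal NumPy sketch (illustrative only, not part of the reported experiment) that constructs them and checks the two properties the article relies on: orthonormality, and maximal entanglement in the sense that either qubit on its own looks completely random.

```python
import numpy as np

# One-qubit computational basis states
zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

s = 1 / np.sqrt(2)
# The four Bell states: the maximally entangled two-qubit states
phi_plus  = s * (np.kron(zero, zero) + np.kron(one, one))   # |Phi+>
phi_minus = s * (np.kron(zero, zero) - np.kron(one, one))   # |Phi->
psi_plus  = s * (np.kron(zero, one) + np.kron(one, zero))   # |Psi+>
psi_minus = s * (np.kron(zero, one) - np.kron(one, zero))   # |Psi->

bell = [phi_plus, phi_minus, psi_plus, psi_minus]

# They form an orthonormal basis of the two-qubit state space,
# which is exactly what a Bell-state measurement tries to resolve
for i, a in enumerate(bell):
    for j, b in enumerate(bell):
        assert abs(np.dot(a, b) - (1.0 if i == j else 0.0)) < 1e-12

# "As strongly entangled as possible": tracing out either qubit
# leaves the maximally mixed state I/2
rho = np.outer(phi_plus, phi_plus)
rho_A = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)  # partial trace over qubit B
assert np.allclose(rho_A, np.eye(2) / 2)
```

Because the four states are mutually orthogonal, a perfect measurement distinguishing all of them is possible in principle; the 50-percent limit discussed below arises only from restricting the apparatus to linear optics.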

The experimental setup consists exclusively of so-called linear components, such as mirrors, beam splitters, and waveplates, which ensures scalability. Credit: La Rici Photography

But there is a problem: when experiments are performed using conventional optical elements, such as mirrors, beam splitters, and waveplates, then two of the four Bell states have identical experimental signatures and are therefore indistinguishable from each other. This means that the overall probability of success (and thus the success rate of, say, a quantum-teleportation experiment) is inherently limited to 50 percent if only such linear optical components are used. Or is it?

This is where the work of the Barz group comes in. As they recently reported in the journal Science Advances, doctoral researchers Matthias Bayerbach and Simone D'Aurelio carried out Bell-state measurements in which they achieved a success rate of 57.9 percent. But how did they reach an efficiency that should have been unattainable with the tools available?

Their outstanding result was made possible by using two additional photons in tandem with the entangled photon pair. It has been known in theory that such auxiliary photons offer a way to perform Bell-state measurements with an efficiency beyond 50 percent. However, experimental realization has remained elusive. One reason for this is that sophisticated detectors are needed that resolve the number of photons impinging on them.

Bayerbach and D'Aurelio overcame this challenge by using 48 single-photon detectors operating in near-perfect synchrony to detect the precise states of up to four photons arriving at the detector array. With this capability, the team was able to detect distinct photon-number distributions for each Bell state, albeit with some overlap for the two originally indistinguishable states, which is why the efficiency could not exceed 62.5 percent, even in theory. But the 50-percent barrier has been busted. Furthermore, the probability of success can, in principle, be brought arbitrarily close to 100 percent, at the cost of adding a higher number of ancilla photons.

Even the most sophisticated experiment is plagued by imperfections, and this reality has to be taken into account when analyzing the data and predicting how the technique would work for larger systems. The Stuttgart researchers therefore teamed up with Prof. Dr. Peter van Loock, a theorist at the Johannes Gutenberg University in Mainz and one of the architects of the ancilla-assisted Bell-state measurement scheme. Van Loock and Barz are both members of the BMBF-funded PhotonQ collaboration, which brings together academic and industrial partners from across Germany working towards the realization of a specific type of photonic quantum computer. The improved Bell-state measurement scheme is now one of the first fruits of this collaborative endeavor.

Although the increase in efficiency from 50 to 57.9 percent may seem modest, it provides an enormous advantage in scenarios where a number of sequential measurements need to be made, for example in long-distance quantum communication. For such upscaling, it is essential that the linear-optics platform has a relatively low instrumental complexity compared to other approaches.

Methods such as those now established by the Barz group extend our toolset to make good use of quantum entanglement in practice, opportunities that are being explored extensively within the local quantum community in Stuttgart and in Baden-Württemberg, under the umbrella of initiatives such as the long-standing research partnership IQST and the recently inaugurated network QuantumBW.

Reference: "Bell-state measurement exceeding 50% success probability with linear optics" by Matthias J. Bayerbach, Simone E. D'Aurelio, Peter van Loock and Stefanie Barz, 9 August 2023, Science Advances. DOI: 10.1126/sciadv.adf4080

The work was supported by the Carl Zeiss Foundation, the Centre for Integrated Quantum Science and Technology (IQST), the German Research Foundation (DFG), the Federal Ministry of Education and Research (BMBF, projects SiSiQ and PhotonQ), and the Federal Ministry for Economic Affairs and Climate Action (BMWK, project PlanQK).

Follow this link:
Breaking the Quantum Limit: From Einstein-Bohr Debates to Achieving Unattainable Efficiency - SciTechDaily

Multiverse Computing Wins UK Funding to Improve Flood Risk Assessment with Quantum Algorithms – HPCwire

LONDON, October 31, 2023: Multiverse Computing, a global leader in value-based quantum computing and machine learning solutions, along with Moody's Analytics and Oxford Quantum Circuits (OQC), have won funding from Innovate UK to use quantum methods to develop large-scale flood prediction models and remove limitations of traditional modeling methods.

The UK Department of Environment, Food and Rural Affairs is overseeing the project and will be the first customer to use this new solution in computational fluid dynamics in an effort to help the country better adapt to extreme weather events linked to climate change.

The three companies won a place in Phase 1 of this competitive process from the UK government's Quantum Catalyst Fund for their joint project, "Quantum-Assisted Flood Modeling: Pioneering Large-Scale Analysis for Enhanced Risk Assessment." The project team will use quantum computing to address the computational challenges in large-scale flood modeling studies and to make flood risk assessment and management more accurate and efficient.

Multiverse Computing is the lead contractor and software provider for the project and will deliver the technical formulation of the problem and algorithm development. OQC will supply the quantum hardware and ancillary resources, while industry partner Moodys Analytics, a global risk management firm, will contribute industry expertise, data requirements, and insights on computational efficiency.

This is the first time Multiverse has proposed applying quantum algorithms to assess potential flood damage. The improvements to accuracy and effectiveness gained by the quantum approach to computational fluid dynamics problems could contribute to climate change adaptation efforts, according to Enrique Lizaso Olmos, founder and CEO of Multiverse Computing.

"Understanding the changes in flood risk will help everyone prepare for extreme weather events, from government agencies working to mitigate those risks to homeowners trying to protect their homes and properties, as well as insurance agencies quantifying these new risks," said Lizaso Olmos.

Flood modelling involves running two-dimensional hydrodynamical models that numerically solve the Shallow Water Equations (SWE), which describe the flow of water in all relevant scenarios such as dam breaks, storm surges, or river flood waves. The computational cost of running simulations with sophisticated models over large areas at high resolution is a limiting factor for current methods. To counteract these limitations, parallel computing and GPU-based computing have been employed to expedite the simulation process.
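For reference, the one-dimensional form of the shallow water equations, which the hydrodynamical models above generalise to two dimensions, can be written in standard notation (this is textbook form, not taken from the project materials):

```latex
\frac{\partial h}{\partial t} + \frac{\partial (hu)}{\partial x} = 0,
\qquad
\frac{\partial (hu)}{\partial t}
  + \frac{\partial}{\partial x}\!\left( h u^{2} + \tfrac{1}{2} g h^{2} \right)
  = -\, g h \,\frac{\partial b}{\partial x},
```

where h is the water depth, u the depth-averaged velocity, g the gravitational acceleration, and b the bed elevation. Solving these conservation laws over a fine grid covering a large catchment is what drives the computational cost described above.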

"The advent of new technologies, such as quantum computing, offers an exciting avenue for advancement," said Sergio Gago, Moody's Managing Director of Quantum and GenAI. "Specifically, there is promising potential in the application of quantum machine learning (QML) to develop emulators as alternatives to traditional physics-based models."

Moody's RMS estimates that, regardless of society's capacity to decrease carbon emissions, the average cost of flood risk in the UK will increase by at least 20% by 2050, with some strong variability by region, season and future emission scenario. RMS flood models estimate more than £700 million ($850 million USD) in losses each year in the country due to inland flooding, with large year-over-year volatility. The flood risk modeling work funded by Innovate UK could help further improve our understanding of the flood risk landscape across geographies and times of the year.

"We recognise the urgency of addressing escalating flood risks, which is why this project is so important to us," says Dr. Ilana Wisby, OQC CEO. "By harnessing the power of OQC's quantum computing, we're not only breaking free from the constraints of classical computing; together we are redefining the future of flood management and helping to create a safer, more resilient world for future generations."

The project team will use a Quantum Physics-Informed Neural Network (QPINN) algorithm to improve these risk assessment methods. The algorithm combines classical data processing with quantum processing using a Variational Quantum Circuit (VQC). The data is encoded into the quantum gate parameters of the VQC, and as the algorithm progresses, these parameters are adjusted to improve the accuracy of target function predictions. Phase 1 lasts three months and ends Nov. 30, 2023. The second phase of the project will last up to 15 months and starts in January 2024. Approval for Phase 2 is based on a successful completion of Phase 1.
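The variational loop described above (encode data into gate parameters, measure, adjust parameters to fit a target) can be illustrated with a toy model. The sketch below is a hedged, classical NumPy simulation of a single-qubit variational circuit, not the project's actual QPINN: the input x is encoded as one rotation, a trainable parameter theta is a second rotation, and gradient descent on theta fits the circuit's measured expectation value to a target function.

```python
import numpy as np

def ry(theta):
    """Single-qubit rotation about the Y axis (real-valued 2x2 matrix)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def expectation_z(x, theta):
    """Encode input x as one rotation, apply the trainable rotation theta,
    and return the <Z> expectation of the resulting state.
    Analytically this equals cos(x + theta)."""
    state = ry(theta) @ ry(x) @ np.array([1.0, 0.0])  # start in |0>
    return state[0] ** 2 - state[1] ** 2              # <Z> = P(0) - P(1)

# Toy regression target: f(x) = cos(x + 0.7) on a grid of inputs
xs = np.linspace(-1.0, 1.0, 21)
target = np.cos(xs + 0.7)

theta, lr = 0.0, 0.1
for _ in range(200):
    # Parameter-shift rule: exact derivative of <Z> with respect to theta
    shift = [(expectation_z(x, theta + np.pi / 2)
              - expectation_z(x, theta - np.pi / 2)) / 2 for x in xs]
    residual = [expectation_z(x, theta) - t for x, t in zip(xs, target)]
    grad = 2 * np.mean([r * d for r, d in zip(residual, shift)])
    theta -= lr * grad  # gradient descent on the mean-squared error

loss = np.mean([(expectation_z(x, theta) - t) ** 2 for x, t in zip(xs, target)])
```

Here theta converges to about 0.7, where the circuit reproduces the target exactly. In the real QPINN the circuit has many parameters and the loss also penalises violations of the governing physics equations; this sketch shows only the core encode-measure-adjust cycle.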

This Small Business Research Initiative competition is funded by the Department for Science, Innovation and Technology (DSIT) and Innovate UK (IUK). The aim of this competition is to explore the benefit of using quantum technologies in various areas of interest for the UK Government, accelerating the adoption of quantum solutions by the public sector and for the public benefit. Thirty projects were awarded funding in Phase 1.

Part of the UK Research and Innovation (UKRI) government organization, Innovate UK supports business-led innovation in all sectors, technologies and regions of the country to help develop and commercialize new products, processes and services that enhance business growth.

About Moodys Analytics

Moody's (NYSE: MCO) is a global integrated risk assessment firm that empowers organizations to make better decisions. Its data, analytical solutions and insights help decision-makers identify opportunities and manage the risks of doing business with others. We believe that greater transparency, more informed decisions, and fair access to information open the door to shared progress. With approximately 14,000 employees in more than 40 countries, Moody's combines international presence with local expertise and over a century of experience in financial markets. Learn more at moodys.com/about.

About Oxford Quantum Circuits

OQC is a world-leading quantum computing company. It brings enterprise-ready quantum solutions to its customers' fingertips and enables them to make breakthrough discoveries. The company's quantum computers are available via data centres, private cloud and on Amazon Braket. For more information, visit http://www.oxfordquantumcircuits.com.

About Multiverse Computing

Multiverse Computing is a leading quantum software company that applies quantum and quantum-inspired solutions to tackle complex problems in finance, banking, manufacturing, energy, and cybersecurity to deliver value today and enable a more resilient and prosperous economy. The company's expertise in quantum algorithms and quantum-inspired algorithms means it can secure maximum results from current quantum devices as well as classical high-performance computers. Its flagship product, Singularity, allows professionals across all industries to leverage quantum computing to speed up and improve the accuracy of optimization and AI models with existing and familiar software tools. The company has also developed CompactifAI, a compressor that uses tensor networks to make large language models more efficient and portable. In addition to finance and AI, Multiverse serves enterprises in the mobility, energy, life sciences and Industry 4.0 sectors. The company is based in San Sebastian, Spain, with branches in Toronto, Paris and Munich.

Source: Multiverse Computing

Read more:
Multiverse Computing Wins UK Funding to Improve Flood Risk Assessment with Quantum Algorithms - HPCwire

Tackling the challenges of quantum computing seriously – Shoosmiths

At the end of last week, the FT published a guest article on quantum computing.

For those unfamiliar with quantum computing, it is the technology that will be capable of harnessing the powers of quantum mechanics to solve problems which are too complex for classical computers (the computers of today).

Classical computing employs streams of electrical impulses to encode information: an electrical impulse may be only 1 or 0 (i.e. on or off), a classical 'bit'. In quantum mechanics, particles can exist in more than one state at a time. In binary terms, this means that a quantum bit (known as a 'qubit') can be both 1 and 0 at the same time. If a computer can be built that harnesses this quantum mechanical phenomenon, then it should be able to solve complex problems much faster than classical computers, or solve problems too complex for classical computers altogether.
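The "both 1 and 0 at the same time" idea has a precise meaning: a qubit's state is a pair of amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1. A minimal NumPy sketch (illustrative only, not from the article):

```python
import numpy as np

# A classical bit is 0 or 1. A qubit is a unit vector of two amplitudes
# (a, b): measurement yields 0 with probability |a|^2 and 1 with |b|^2.
plus = np.array([1.0, 1.0]) / np.sqrt(2)  # equal superposition of 0 and 1

probs = np.abs(plus) ** 2
assert np.isclose(probs.sum(), 1.0)      # amplitudes are normalised
assert np.allclose(probs, [0.5, 0.5])    # 50/50 chance of measuring 0 or 1

# Simulate repeated measurements: each outcome is random,
# but the long-run frequencies match the amplitudes
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=10_000, p=probs)
frequency_of_one = samples.mean()  # approaches 0.5
```

The computational power comes from the fact that n qubits carry 2^n amplitudes at once, which certain algorithms can manipulate collectively, something no classical bit register can do.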

In 1994, Peter Shor (a mathematician) wrote an algorithm (known as Shor's Algorithm) that could crack the Rivest-Shamir-Adleman (RSA) algorithm. RSA is a suite of cryptographic algorithms used for systems-security purposes: it secures huge amounts of sensitive data, from national security to personal data, both within a firm's systems and as it is sent externally. Shor's Algorithm is not capable of running efficiently on classical computers: it requires quantum computing to be effective.
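The structure of Shor's Algorithm can be illustrated with a toy sketch. The quantum speedup lies entirely in finding the period r of f(x) = a^x mod N; once r is known, a factor of N follows from classical number theory via the greatest common divisor. In the hedged sketch below, classical brute force stands in for the quantum period-finding step, so this shows the reduction, not the speedup:

```python
from math import gcd

def find_period(a, n):
    """Smallest r > 0 with a^r = 1 (mod n), found by brute force here.
    This is the step a quantum computer performs exponentially faster."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    """Classical post-processing of Shor's Algorithm: turn the period
    of a^x mod n into a nontrivial factor of n (or None for a bad a)."""
    if gcd(a, n) != 1:
        return gcd(a, n)      # lucky guess: a already shares a factor with n
    r = find_period(a, n)
    if r % 2 == 1:
        return None           # odd period: retry with a different a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None           # trivial square root: retry with a different a
    return gcd(y - 1, n)

# Classic example: factor 15 with base a = 7.
# The period of 7^x mod 15 is 4, 7^2 mod 15 = 4, and gcd(4 - 1, 15) = 3.
factor = shor_factor(15, 7)   # -> 3
```

For RSA-sized moduli the `while` loop above would take longer than the age of the universe; the quantum Fourier transform finds the period in polynomial time, which is exactly why RSA is threatened.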

Quantum computing is not a pipe dream: there are myriad firms working on developing it, and there are firms that already produce hardware with limited quantum computing capability (which works alongside classical computers). It may be a decade before quantum computing becomes a reality (and many more years before it is commoditised); however, when it does, it will change the way in which we all need to secure our data. The security of both previous and future communications and storage will be at risk (or non-existent). In 2020, the UK's National Cyber Security Centre published a white paper, "Preparing for Quantum-Safe Cryptography". In its conclusions, it stated that there is unlikely to be a single quantum-safe algorithm suitable for all applications. In 2021, the NCSC announced its first quantum-safe algorithm. In 2022, the U.S. Department of Commerce's National Institute of Standards and Technology (NIST) announced its first four quantum-resistant cryptographic algorithms.

The Digital Regulation Cooperation Forum, which brings together four leading regulators in the UK, published its Quantum Technologies Insights Paper earlier this year (June 2023). The paper considers the potential of quantum computing and the issues that need to be considered now (as in, now) to prepare the world for this next big chapter in computing technology.

There are a few things to note:

The author of the FT article ended with a limerick written by Shor himself. We will end with an idiom. In binary.

01101001 01101110 00100000 01110100 01101000 01100101 00100000 01110111 01101111 01110010 01100100 01110011 00100000 01101111 01100110 00100000 01010011 01100101 01110010 01100111 01100101 01100001 01101110 01110100 00100000 01000101 01110011 01110100 01100101 01110010 01101000 01100001 01110101 01110011 00111010 00100000 01100010 01100101 00100000 01100011 01100001 01110010 01100101 01100110 01110101 01101100 00100000 01101111 01110101 01110100 00100000 01110100 01101000 01100101 01110010 01100101 00101110
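Readers who want to decode the closing message themselves can do so with a few lines of Python; the idiom itself is left for you to run:

```python
def decode_binary(bits: str) -> str:
    """Turn space-separated 8-bit ASCII codes back into text."""
    return "".join(chr(int(byte, 2)) for byte in bits.split())

# Example: "01101000 01101001" decodes to "hi"
assert decode_binary("01101000 01101001") == "hi"

# Paste the article's full binary string in place of the ellipsis
# to read the idiom:
# message = decode_binary("01101001 01101110 ...")
```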

Excerpt from:
Tackling the challenges of quantum computing seriously - Shoosmiths