Archive for the ‘Quantum Computer’ Category

The 3 Quantum Computing Stocks You Need to Own – InvestorPlace

Finding the best quantum computing stocks to buy is critical because this is clearly the next big industry.

Quantum computers promise to bring the power of quantum mechanics to bear in solving our most vexing problems. They may be capable of processing more data, faster, than any classical computer.

If all that happens, then quantum computing stocks may bring generational wealth to their investors.

Quantum computers are unique in that they use qubits rather than classical bits. These qubits are fundamentally quantum mechanical and do not have a defined value until measured. Qubits can be made from a variety of materials, but as of now there is no industry consensus on how best to make one. Whose qubit becomes the standard will have a big impact on who wins and loses in the quantum race.

Because of this, it's important to understand the science as well as the finances of quantum computing companies. What type of qubit a company plans to use, and how it plans to deploy it to overcome quantum noise and other issues, will be central to whether that company survives and prospers.

But the companies that do prosper could be the founders of a new trillion-dollar industry. With that said, here are some of the best quantum computing stocks to buy.


IonQ (NYSE:IONQ) plans to use small, modular quantum computers and network them together to solve big problems.

Their latest offering is the 32-qubit IonQ Forte, built using trapped ions as qubits. While 32 qubits may seem low, IonQ hopes the modular design will allow multiple systems to work in parallel. That could help overcome quantum noise and allow near-limitless scaling to meet the needs of any user.

IonQ is also making headway in bringing its computers to the masses. They've partnered with Amazon to bring the 25-qubit IonQ Aria to Amazon Braket, an Amazon Web Services platform for quantum computing.

This means that programmers can now develop software and services on IonQ computers more easily than before. This will give IonQ a leg up in the race to become the standard for quantum ecosystems.

IonQ is still a small company, so an investment here could yield very large gains. But it is still somewhat speculative relative to the competition.

In Q1 2023 it had just $4.2 million in revenue, and a net loss of $27 million. It does have $51 million in cash and $336 million in investments, so it still has plenty of runway to survive. But it will eventually need real revenue in order to justify its valuation.

IonQ's partnership with Amazon is a good first step toward making itself available to the public. If they can build on this success, they'll prove themselves one of the best quantum computing stocks of our generation.


Intel (NASDAQ:INTC) may seem an unusual bet on the quantum computing industry, but their recent moves make them an enticing one.

They've recently released a software development kit (SDK) for quantum computers, for starters. This SDK simulates how a quantum computer will act, and allows programmers to write and debug programs for quantum computers.

Even though current quantum computers are rare and difficult to maintain, this lets software developers get a head start in developing applications.
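To make that concrete, here is a minimal sketch in plain Python, not Intel's actual SDK (whose API isn't described here), of the kind of thing a quantum simulator lets a developer do: run a one-qubit program on simulated hardware and check the measurement statistics.

```python
import random

def simulate_one_qubit_program(shots=1000):
    """Toy state-vector simulation: apply a Hadamard gate to |0>,
    then measure repeatedly. Real SDK simulators expose far richer
    gate sets plus debugging tools, but the principle is the same."""
    amp0, amp1 = 1.0, 0.0                  # start in state |0>
    h = 2 ** -0.5                          # Hadamard coefficient
    amp0, amp1 = h * (amp0 + amp1), h * (amp0 - amp1)
    counts = {0: 0, 1: 0}
    for _ in range(shots):
        counts[0 if random.random() < amp0 ** 2 else 1] += 1
    return counts

print(simulate_one_qubit_program())        # roughly {0: 500, 1: 500}
```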

Intel is also building itself into a real competitor for producing the chips that will run future quantum computers. They are developing their capacity to produce quantum dots at the scale and purity necessary for use as qubits.

Quantum dots are just one way companies are trying to make qubits, but if Intel's process is successful, it could become the standard. That would make Intel a big player in the future quantum chip industry.

Intel's biggest strength is its background in computer chips.

One of their weaknesses is that there is not yet a standard for producing qubits, and the quantum dots they are working on might not be adopted by other quantum computing companies.

Compounding this is the fact that their cash flow has suffered as the chip shortage has eased. In Q1 2023 they had revenue of $11.7 billion and a net loss of $2.8 billion.

But quantum chips could provide a path back to profit, and their quantum dot bet is one of the most promising paths forward for the industry.


Honeywell (NASDAQ:HON) may not be known for their quantum computers, but Honeywell Quantum Solutions has quietly made itself a real player in the industry.

Like IonQ, Honeywell is using trapped ions to power its quantum systems. But Honeywell is also pushing the boundaries of science in ways that could truly solve the problem of quantum error correction.

The biggest unsolved problem in quantum computing isn't how to make a qubit; it's how to make a qubit that holds data for any significant length of time.

The bits of classical computers are exceptionally stable, but the qubits of quantum computers are liable to lose their information content if they interact at all with outside particles.

Qubits losing their information introduces errors into a program, so quantum error correction is necessary to keep quantum programs running smoothly.

Recently, though, Honeywell achieved what could be a quantum leap in quantum error correction. Quantinuum, held jointly by Honeywell and Cambridge Quantum Computing (privately held), has created a topological quantum state that could be the key to solving quantum error correction.

Solving quantum error correction is essential for any quantum computer to achieve widespread adoption. Quantinuum's achievement puts it at the forefront of the race to broadly commercialize quantum computers.

Honeywell is also a safe haven for investment even apart from their quantum computers. In Q1 of 2023 they had $8.9 billion in revenue and $1.4 billion in net income. Quantum computing was only a tiny part of that.

The advances from Quantinuum could change all that. And that could make Honeywell one of the best quantum computing stocks to buy.

On the date of publication, John Blankenhorn held a long position in Intel. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

John Blankenhorn is a neuroscientist at Emory University. He has significant experience in biochemistry, biotechnology and pharmaceutical research.

Read the original post:
The 3 Quantum Computing Stocks You Need to Own - InvestorPlace

Quantum sensors will start a revolution if we deploy them right – Nature.com

Quantum sensors exploit the fundamental properties of atoms and light to make measurements of the world. The quantum states of particles are extremely sensitive to the environment, which is a virtue for sensing, if problematic for making a quantum computer. Quantum sensors that use particles as probes can quantify acceleration, magnetic fields, rotation, gravity and the passage of time more precisely than can classical devices that are engineered or based on chemical or electrical signals. They can be used to make atomic clocks that are smaller and more accurate, cameras that can see through fog and around corners, and devices for mapping structures underground, among many other potential applications. They stand to transform a multitude of sectors, from energy, land use and transport to health care, finance and security. But their commercial promise needs to be appreciated more.

As researchers developing quantum sensors in the laboratory, we are keen to make governments and industry more aware of the possible benefits, in particular in improving the safety of national critical infrastructure that relies on sensors, such as air-traffic control systems and water utilities. However, we and others face hurdles in gaining attention and funding to adapt quantum sensors for use in real-world settings.

One challenge is that it is hard to predict exactly how and where emerging technologies will be adopted. The history of physics is full of serendipitous inventions. X-ray generators, for example, were the accidental by-product of experiments to see whether beams of electrons could pass through glass, yet they are now crucial to medicine and airport security. The inventor of the laser, Theodore Maiman, famously described that technology as a solution seeking a problem.

Another factor is that many people, including business leaders, think quantum technologies are devices of the future, not the present. Unlike quantum computers, which get a lot of press but might be decades away from offering wide commercial advantage, quantum sensors are already in use in the lab. A handful are in commercial use: atomic clocks, for example, measure the passage of time supremely accurately using high-frequency quantum transitions in atoms. Their accuracy maintains the synchronization of communication and energy networks, and digital radio stations. They are crucial for satellite navigation services such as GPS.


Even so, it took 20 years to move GPS receivers from being specialist devices used by the military, tech-savvy hikers and ships' captains to providing navigation for smartphones and cars. Now, the quantum community needs to establish similar pathways for realizing the commercial benefit of other types of quantum sensor.

Quantum gravity sensors and quantum gas detectors flown on satellites could collect accurate data on levels of groundwater, carbon dioxide and methane to improve climate modelling. Quantum magnetic sensors can image people's brain signals in real time [1], and quantum gravimeters can monitor underground water levels and volcanic eruptions [2]. Combinations of quantum sensors that track gravity gradients, magnetic fields and inertial forces 1,000 times more precisely than can classical ones might, for instance, enable reliable navigation in places where satellite signals are jammed or cannot reach, such as remote sites, conflict zones or underwater.

Here we highlight five priorities for commercializing quantum sensors to get them adopted faster.

Innovators in industry are rarely excited by a lab result that simply proves a concept. They want to know that a device will work reliably for a specific application, and that it will benefit their business's finances. Researchers need to ensure that any sensor bound for the market is robust and reliable, can be manufactured reproducibly and cost-effectively, and is compatible with other systems in use. In practice, this might mean redesigning many aspects of the technology. Each tweak brings fresh challenges.

For example, in our lab at the UK Quantum Technology Hub Sensors and Timing in Birmingham, we have developed a sensor that measures gradients in gravity. In two chambers 1 metre apart, lasers trap rubidium atoms from a vapour and cool them to a standstill [3]. More laser pulses create superpositions of quantum states and read these out for the rubidium atoms in each cloud. Software converts those signals into a gravity-gradient measurement. By using a single laser beam to manipulate the atoms, this quantum device is 1,000 times less sensitive to noise from vibrations than are conventional gravimeters, and is thus easier to deploy.
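The last software step described above, converting the two chambers' signals into a gradient, comes down to a simple difference. A minimal sketch with hypothetical readings (the function name and numbers are illustrative, not the hub's actual code):

```python
def gravity_gradient(g_upper, g_lower, separation_m=1.0):
    """Turn gravity readings from two atom clouds, one above the other,
    into a vertical gravity gradient. Differencing the two measurements
    cancels common-mode vibration noise, which is why a gradiometer is
    easier to deploy than a single absolute gravimeter."""
    return (g_lower - g_upper) / separation_m   # units: s^-2

# Hypothetical readings in m/s^2 for clouds 1 metre apart.
print(gravity_gradient(9.8123407, 9.8123438))   # ~3.1e-6 s^-2, close to
                                                # Earth's normal vertical gradient
```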

A diamond-based quantum sensor can measure magnetic fields at the atomic scale. Credit: David Kelly Crow/de Leon lab/Princeton University

The version we first demonstrated in the lab was the size of a small van, with tables full of optical components and racks of electronic systems and power supplies. It was built from bespoke parts and tuned by hand. Taking this device outside the lab, to sense subterranean tunnels through small changes in local gravity [4], meant making all of the components more robust, smaller and cheaper, as well as improving their performance.

Our physicists and engineers had to find ways to control the laser beam under varying temperatures, contain it in a vacuum to avoid air turbulence and pulse the laser to reduce the impacts of stray magnetic fields. Work is ongoing to operate the device on a moving platform to ease deployment, to increase its sensitivity and bandwidth to speed up mapping, and to reduce its size to that of a backpack so it can be mounted on a drone to survey large areas.


One promising avenue for miniaturization is integrating quantum sensors in photonic microchips [5,6]. These rely on light (photons) rather than the electrons used in conventional microchips, and are fast and energy efficient. Similar technology is found in fibre-optic networks. Quantum sensors could be miniaturized using photonic chips and existing manufacturing processes for micro-electro-mechanical systems (MEMS), which are used in vehicle airbags. The benefit is that they are robust and cope with vibrations better than bulkier optical systems do.

The challenge is integrating all the elements into one system that includes lasers, modulators, waveguides and beam splitters, as well as components such as vapour cells. Further research and investment are needed into new materials, fabrication technologies, device packaging and procedures for testing and validation. Standardization of quantum sensor technologies as low-cost building blocks is also urgently required, mirroring the processes for fibre-optic communication and MEMS sensor technologies.

Researchers need to talk to business leaders to determine how quantum sensors can add value across a range of applications. For example, uses for a gravity sensor are not obvious; few people visualize their surroundings in terms of gravity or the density of materials. But after discussions with more than 100 companies, we concluded that gravity sensors would be excellent for illuminating unknowns in the ground, from the position of forgotten mineshafts to groundwater levels and the distribution of carbon in soils and magma flows. These can, in principle, be seen by classical gravimeters, but ground vibrations make the measurement time infeasibly long, typically 5–10 minutes for a single data point. With quantum gravity gradiometers, such data could be collected in seconds, opening up the potential for gravity cartography [4]. And that's just what we have focused on so far.
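A rough back-of-envelope, assuming a 1,000-point survey grid (an illustrative figure, not one from the hub), shows why that speed-up matters:

```python
points = 1_000                       # assumed survey grid size
classical_min_per_point = 7.5        # midpoint of the 5-10 minute range above
quantum_s_per_point = 5              # "collected in seconds" (assumed value)

classical_hours = points * classical_min_per_point / 60
quantum_hours = points * quantum_s_per_point / 3600
print(f"classical: {classical_hours:.0f} h, quantum: {quantum_hours:.1f} h")
# classical: 125 h, quantum: 1.4 h
```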

An optical clock in which strontium ions oscillate in response to laser light. Credit: Andrew Brookes, National Physical Laboratory/Science Photo Library

Money for applied research and multidisciplinary collaborations between academia and industry is needed to validate these ideas. In our case, the next step involves geophysics research using such gravimeters to improve understanding of how water flows and accumulates underground. This information could be used to refine flood models, for example. Civil-engineering research is also required on how best to detect leakage in water pipes using such sensors. Broader technological and economic considerations will determine how this approach can be used most effectively in water management.

Companies should start thinking about new business models, such as offering underground mapping services to farmers to help reduce water use in irrigation. Engaging in pilot projects would put businesses in a good position to capitalize on market disruption, rather than being caught out by it.

Any sensor must be plugged into a bigger system to reap its benefits. For example, an inertial sensor, one that detects movement, is relatively useless on its own. But when integrated with electronics, software and a display in a smartphone, it can provide health information on step counts and calories burnt by the user.

Similarly, quantum accelerometers and sensors of rotation, gravity and magnetic field can be combined into position, navigation and timing systems for subsea and underground use. For this application, quantum sensors offer reduced bias, better precision and more stability than do their classical counterparts, allowing navigation with metre-level accuracy without having to use global satellite systems such as GPS. This capability would enable exploration of resources on the seabed, as well as securing and maintaining pipelines, cables and foundations of offshore wind farms and oil rigs, for example.

However, it remains fiendishly challenging to integrate quantum sensors into a full-blown navigation system. Constructing an inertial measurement unit alone requires three accelerometers, one for each spatial dimension, and three rotation sensors, one for each rotational degree of freedom, arranged at perfect right angles and all linked with a clock. If fitted on a vehicle or submarine, such a navigation system would need to compensate for small changes in local gravity and other forces induced by Earth's rotation. The whole thing would need to be calibrated, which is hard to achieve at the high level of precision needed.
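To see why bias and calibration matter so much, consider a one-dimensional toy sketch (the bias figure is an assumption for illustration, not a quoted spec): a tiny constant accelerometer error, integrated twice into position, grows quadratically with time.

```python
def dead_reckon(accels, dt):
    """Integrate acceleration twice to track 1-D position. Real inertial
    navigation does this in 3-D with rotation and gravity compensation;
    this sketch isolates the error-growth mechanism only."""
    v = x = 0.0
    for a in accels:
        v += a * dt
        x += v * dt
    return x

dt, n = 0.01, 360_000                    # 100 Hz samples over one hour
bias = 1e-5                              # assumed accelerometer bias, m/s^2
drift = dead_reckon([bias] * n, dt)      # position error from bias alone
print(f"~{drift:.0f} m of drift after one hour")   # ~65 m
```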


Gravity and magnetic sensors would be needed for mapping these fields along the trajectory of the vehicle, as well as a computer control system with specialist software. Databases of field readings would need to be developed for comparison against the recorded gravity and magnetic traces, to allow absolute position fixes to deal with unavoidable long-term drifts.

Researchers also need to consider in detail how quantum sensor systems might be linked to national and international infrastructure networks. For example, communication networks could be revolutionized with the next generation of quantum clocks, optical clocks, which could be 1,000-fold more precise than the time provided by current satellite navigation systems. This might enable new modes of ultra-fast broadband, for example, which squeeze more data packets into channel bandwidth and use less energy to transmit each bit of data. Similarly, quantum sensors capable of detecting hydrogen could speed up the energy transition from natural gas to hydrogen fuels, because they could detect leaks and safeguard infrastructure to enable secure roll-out of this potentially highly explosive fuel.

Whereas academic researchers can develop sensors with the right properties, industry needs to lead this systems integration stage. Existing academic funding streams are too small to support such collaborative research. Substantial long-term research and development contracts with industry are needed to make this happen. For instance, in the 2000s, funding from the US Defense Advanced Research Projects Agency helped to create the chip-scale atomic clock within a decade, through a dedicated development programme involving academia and industry.

Raw data from a sensor needs to be transformed into information that is useful for a specific task. For example, although a quantum magnetic-field sensor can detect tiny fields associated with patterns of neural activity in the brain, 3D visualizations of brain activity require an array of such sensors, and algorithms and graphical representations to display them in ways a physician can interpret.

Development of such systems is under way [1] and could revolutionize understanding of brain conditions. Real-time mapping (scans 100 times a second, for example) and analysis of brain responses to visual or sensory stimuli, even while a person is moving, might replace current techniques for diagnosing brain disorders based on patient questionnaires. It could also allow physicians to assess the efficacy of drugs for brain conditions on an individual basis.

A researcher at German firm Q.ANT checks a quantum sensor intended for industrial use. Credit: Sebastian Gollnow/dpa via Alamy

Similarly, advanced analytics are needed to extract 3D underground images from gravity surveys, where it remains challenging to determine how deep sensed objects are. Banks of radars driven by quantum oscillators need to be networked to show detailed images instead of dots on a radar screen, such as would be needed to classify and distinguish drones from birds flying over a city. Big data techniques must be deployed to harvest all of this information, enabling the monitoring of tens of thousands of delivery drones in cities, for instance.

Perhaps the greatest data challenge in terms of time and effort is to create training data sets through trials. Researchers need to conduct large-scale medical trials to find biomarkers for brain conditions, collect data from networks of gravimeters to understand underground water and other assets, and source radar data through sensor networks across cities. We encourage governments to fund such programmes to seed future ventures.

Although many countries have begun coordinated efforts to develop the base level of quantum technologies, there is still a scattered approach to adoption and exploitation. With many groups working in isolation, tackling the research challenges we outline would take decades. To speed things up, a strategy for coordinating research projects on quantum sensors is needed.

At the research end of the pipeline, some nations, including Germany, Japan, the Netherlands, the United Kingdom and the United States, have set up hubs and large projects to align academic and national needs in quantum technologies by bundling expertise and providing portals for interactions with industry and other partners. Yet, generally, sensors are not getting the attention they deserve in national quantum tech initiatives, with a few exceptions, such as QuantumBW, an initiative by the German state of Baden-Württemberg, which explicitly focuses on quantum sensing.


Governments need to introduce policies and regulation to support innovation in quantum sensors, with one focus being enhancements to the management and security of critical national infrastructure. For example, a 2020 presidential order requires US national aviation authorities to become independent of global navigation satellite systems' timing by 2025. This would ensure air-traffic control systems keep working even if those systems fail by accident or through hostile intervention. It is still too early to determine the impact, but the order has set the boundary conditions for the emergence of business ideas related to timing technologies.

Similar approaches in communications, water-resource management and medicine might encourage the uptake of quantum sensors in those sectors to make them more resilient by having independent timing and navigation or more detailed data.

Initiatives are also needed to bring companies, from component manufacturers to system integrators, together with academics to help find business solutions, rather than simply come up with the technology and then quickly scale up production in the hope that there will be a market. One promising effort is the National Accelerator for Quantum Sensors in the United Kingdom. Launched in 2022 and still to be fully funded, this accelerator involves three corporate giants with a global reach (BAE Systems, BP and BT) and is committed to bringing in dozens more companies. Although initiatives in other countries target quantum technologies in general, such as QED-C in the United States, the UK programme is unique in that it focuses on sensors.

To conclude, a long-term, industry-led approach for quantum sensor innovation is urgently needed. The physics of quantum sensors can deliver the performance, but the question is: who will lead the world in delivering the benefits?

View original post here:
Quantum sensors will start a revolution if we deploy them right - Nature.com

Quantum Computing Inc. Receives Follow-On Subcontract Award to … – PR Newswire

LEESBURG, Va., May 23, 2023 /PRNewswire/ -- Quantum Computing Inc. ("QCI" or the "Company") (NASDAQ: QUBT), a first-to-market full-stack photonic-based quantum computing and solutions company, today announces that it received a follow-on task order to its subcontract award announced on February 8, 2023, to support NASA in remote sensing and climate change monitoring. In addition to testing its proprietary quantum photonic system for remote sensing applications (QLiDAR), QCI will also be processing satellite images by utilizing its photonic-based reservoir computing technology. This initial testing engagement is expected to be completed during the second quarter of 2023.

Dr. William McGann, QCI Chief Technology Officer, commented, "Sunlight interference (noise) is a huge issue in space-based LiDAR remote sensing. LiDAR measurements of the air, and of optically thin aerosols/clouds during daytime from space, experience compromised signal integrity. As a result, it is very difficult, if at all possible, to make good daytime LiDAR measurements from space with adequate signal-to-noise ratios. In this expanded project, we will explore reservoir photonic computing to remove sunlight noise in satellite LiDAR images, thereby enabling daytime operations of spaceborne LiDAR systems. Our current prototype systems have shown outstanding performance in both pattern prediction and recognition, demonstrating good potential for sunlight noise removal. Through this project, we hope to prove the concept and develop a roadmap for future large-scale deployment to help NASA and many other potential customers."
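QCI's reservoir computer is photonic hardware, but the idea behind reservoir computing can be sketched in classical software. Below is a minimal echo state network in Python, a conventional analogue only, trained to strip synthetic noise from a toy waveform; every parameter here is an illustrative assumption, not QCI's design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: recover a clean waveform from a noisy copy of it.
t = np.linspace(0, 20 * np.pi, 2000)
clean = np.sin(t) * np.sin(0.1 * t)
noisy = clean + 0.3 * rng.standard_normal(t.size)

# Fixed random reservoir; only the linear readout gets trained.
N = 300
W_in = rng.uniform(-0.5, 0.5, N)
W = rng.standard_normal((N, N)) * 0.1
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()   # keep spectral radius < 1

states = np.zeros((t.size, N))
x = np.zeros(N)
for i, u in enumerate(noisy):
    x = np.tanh(W @ x + W_in * u)               # reservoir state update
    states[i] = x

# Ridge-regression readout mapping reservoir states to the clean target.
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(N), states.T @ clean)
denoised = states @ W_out
print("residual noise:", np.std(noisy - clean), "->", np.std(denoised - clean))
```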

QCI, through its wholly owned subsidiary, QI Solutions, which focuses on federal government projects, will perform both the original quantum LiDAR work and the application of photonic computing capability to process the LiDAR data. This will be accomplished under a subcontract from Science Systems and Applications, Inc. (SSAI), a leading scientific, engineering and IT solutions provider. Under the expanded subcontract, QCI will run the data from the QLiDAR system through the photonic-based reservoir computer to improve the calculation of the level of water released from snowmelt. Upon successful completion of the task under the new subcontract, follow-on options include airborne testing and positioning these devices together with the photonic reservoir system to enhance the signal integrity of the satellite images, creating a network for monitoring snow levels globally. This will promote a better understanding of climate change and provide accurate data for industry and agriculture.

"This expanded contract is a significant opportunity for QCI to demonstrate and validate two distinct QCI technology offerings to the recognized preeminent global leader in space research and exploration," commented Sean Gabeler, President of Q1Solutions. "QCI's photonic LiDAR and reservoir photonic computing systems deliver new measurement and data processing capabilities with single-photon sensitivity, strong noise rejection, and high-ranging spatial resolution and image fidelity at great distances through challenging environments such as snow, ice and water, during night or day. QCI systems are built for easy, scalable, and versatile use with favorable size, weight, power, and cost combined with increased connectivity and capacity, decreased training bias, and strengthened security."

For additional information on the company's suite of solutions, please visit our website or contact our team directly.

About Quantum Computing Inc. (QCI)

Quantum Computing Inc. is a full-stack quantum hardware and software company on a mission to accelerate the value of quantum computing for real-world business solutions, delivering the future of quantum computing, today. The company delivers accessible and affordable full-stack solutions with real-world industrial applications, using photonic-based quantum entropy, which can be used anywhere with little to no training, operates at normal room temperatures and draws low power. QCI is competitively advantaged in delivering its quantum solutions at greater speed, accuracy, and security at less cost. QCI's core entropy computing capability, the Dirac series, delivers solutions for both binary and integer-based optimization problems using over 11,000 qubits for binary problems and over 1,000 (n=64) qudits for integer-based problems, each of which is the highest number of variables and problem size available in quantum computing today. Using the Company's core quantum methodologies, QCI has also developed specific quantum applications for AI, cybersecurity and remote sensing, including its Reservoir Quantum Computing, reprogrammable and non-repeatable Quantum Random Number Generator, and LiDAR products. For more information about QCI, visit http://www.quantumcomputinginc.com.

About QI Solutions, Inc. (QIS)

QI Solutions, Inc., a wholly owned subsidiary of Quantum Computing Inc., is a supplier of quantum technology solutions and services to the government and defense industries. With a team of qualified and cleared staff, QIS delivers a range of solutions from entropy quantum computing to quantum communications and sensing, backed by expertise in logistics, manufacturing, R&D and training. The company is exclusively focused on delivering tailored solutions for partners in various government departments and agencies. For more information about QIS, visit https://qiwerx.com/.

Important Cautions Regarding Forward-Looking Statements

This press release contains forward-looking statements as defined within Section 27A of the Securities Act of 1933, as amended, and Section 21E of the Securities Exchange Act of 1934, as amended. By their nature, forward-looking statements and forecasts involve risks and uncertainties because they relate to events and depend on circumstances that will occur in the near future. Those statements include statements regarding the intent, belief or current expectations of Quantum Computing Inc. (the "Company"), and members of its management as well as the assumptions on which such statements are based. Prospective investors are cautioned that any such forward-looking statements are not guarantees of future performance and involve risks and uncertainties, and that actual results may differ materially from those contemplated by such forward-looking statements.

The Company undertakes no obligation to update or revise forward-looking statements to reflect changed conditions. Statements in this press release that are not descriptions of historical facts are forward-looking statements relating to future events, and as such all forward-looking statements are made pursuant to the Securities Litigation Reform Act of 1995. Statements may contain certain forward-looking statements pertaining to future anticipated or projected plans, performance and developments, as well as other statements relating to future operations and results. Any statements in this press release that are not statements of historical fact may be considered to be forward-looking statements. Words such as "may," "will," "expect," "believe," "anticipate," "estimate," "intends," "goal," "objective," "seek," "attempt," "aim to," or variations of these or similar words, identify forward-looking statements. These risks and uncertainties include, but are not limited to, those described in Item 1A in the Company's Annual Report on Form 10-K, which is expressly incorporated herein by reference, and other factors as may periodically be described in the Company's filings with the SEC.

SOURCE Quantum Computing Inc.

View original post here:
Quantum Computing Inc. Receives Follow-On Subcontract Award to ... - PR Newswire

How Quantum Computing Is Already Changing the World – InvestorPlace

[Editor's note: "How Quantum Computing Is Already Changing the World" was previously published in December 2022. It has since been updated to include the most relevant information available.]

I'm a history junkie. So, in this special Sunday issue of Hypergrowth Investing, let me share an interesting story that I bet a lot of you have never heard before. And interestingly enough, it could be the key to helping you capitalize on the AI Revolution.

The story begins at a gathering of physicists in Brussels in October 1927. In attendance were scientists that, today, we praise as the brightest minds in the history of mankind.

Albert Einstein was there; so was Erwin Schrödinger, who devised the famous Schrödinger's cat experiment, and Werner Heisenberg, the man behind the world-changing Heisenberg uncertainty principle. So were Louis de Broglie, Max Born, Niels Bohr and Max Planck.

The list goes on and on. Of the 29 scientists who met in Brussels in October 1927, 17 of them went on to win a Nobel Prize.

These are the minds that collectively created the scientific foundation upon which the modern world is built.

And yet, when they all descended upon Brussels nearly 94 years ago, they were stumped by one concept. It's one that, for nearly a century, has remained the elusive key to unlocking humankind's full potential.

And now, for the first time ever, that concept is turning into a disruptive reality through breakthrough technology that will change the world as we know it.

So what exactly were Einstein, Schrodinger, Heisenberg and the rest of those Nobel laureates talking about in Brussels back in 1927?

Quantum mechanics.

I'll start by saying that the underlying physics of this breakthrough, quantum mechanics, is highly complex. It would likely require over 500 pages to understand fully.

But, alas, here's my best job at making a CliffsNotes version in 500 words instead.

For centuries, scientists have developed, tested, and validated the laws of the physical world, known as classical mechanics. These scientifically explain how and why things work, where they come from, so on and so forth.

But in 1897, J.J. Thomson discovered the electron. And he unveiled a new, subatomic world of super-small things that didn't obey the laws of classical mechanics at all. Instead, they obeyed their own set of rules, which have since become known as quantum mechanics.

The rules of quantum mechanics differ from that of classical mechanics in two very weird, almost-magical ways.

First, in classical mechanics, objects are in one place at one time. You are either at the store or at home, not both.

But in quantum mechanics, subatomic particles can theoretically exist in multiple places at once before they're observed. A single subatomic particle can exist at point A and point B at the same time, until we observe it. And at that point, it exists only at either point A or point B.

So, the true location of a subatomic particle is some combination of all its possible positions.

This is called quantum superposition.

Second, in classical mechanics, objects can only work with things that are also real. You can't use an imaginary friend to help move the couch. You need a real friend instead.

But in quantum mechanics, all of those probabilistic states of subatomic particles are not independent. They're entangled. That is, if we know something about the probabilistic positioning of one subatomic particle, then we know something about the probabilistic positioning of another, meaning that these already super-complex particles can work together to create a super-complex ecosystem.

This is called quantum entanglement.

So in short, subatomic particles can theoretically have multiple probabilistic states at once, and all those probabilistic states can work together, all at once, to accomplish their task.
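In the standard notation of quantum mechanics (a level of detail this article otherwise skips), the two ideas look like this:

```latex
% Superposition: a qubit is a weighted blend of both basis states;
% |alpha|^2 and |beta|^2 are the probabilities of reading 0 or 1.
\[ |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1 \]

% Entanglement: in this Bell state, neither particle has its own
% definite value, yet measuring one fixes the other exactly.
\[ |\Phi^{+}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|00\rangle + |11\rangle\bigr) \]
```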

And that, in a nutshell, is the scientific breakthrough that stumped Einstein back in the early 1900s.

It goes against everything classical mechanics had taught us about the world. It goes against common sense. But it's true. It's real. And now, for the first time ever, we are learning how to harness this unique phenomenon to change everything about everything…

The study of quantum theory has led to huge advancements over the past century. That's especially true over the past decade. Scientists at leading tech companies have started to figure out how to harness the power of quantum mechanics to make a new generation of super quantum computers. And they're infinitely faster and more powerful than even today's fastest supercomputers.

Again, the physics behind quantum computers is highly complex, but here's my shortened version…

Today's computers are built on top of the laws of classical mechanics. That is, they store information on what are called bits, which can store data binarily as either 1 or 0.

But what if you could turn those classical bits into quantum bits, or qubits, leveraging superposition to store both 1 and 0 at once?

Further, what if you could leverage entanglement and have all multi-state qubits work together to solve computationally taxing problems?

Theoretically, you'd create a machine with so much computational power that it would make today's most advanced supercomputers seem ancient.
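As a toy illustration, here is a minimal Python sketch of a two-qubit entangled register (a classical simulation of the measurement statistics only, with none of the speed-up): sampling it shows the perfect correlations that entangled qubits exhibit.

```python
import random

# State vector over the basis |00>, |01>, |10>, |11> for a Bell pair:
# equal amplitudes on |00> and |11>, none on the mixed outcomes.
amps = [2 ** -0.5, 0.0, 0.0, 2 ** -0.5]
probs = [a * a for a in amps]

counts = {"00": 0, "01": 0, "10": 0, "11": 0}
for _ in range(1000):
    r, cum = random.random(), 0.0
    for outcome, p in zip(counts, probs):
        cum += p
        if r < cum:
            counts[outcome] += 1
            break

print(counts)  # ~{'00': 500, '01': 0, '10': 0, '11': 500}: the qubits always agree
```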

That's exactly what's happening today.

Google has built a quantum computer that is about 158 million times faster than the world's fastest supercomputer.

That's not hyperbole. That's a real number.

Imagine the possibilities if we could broadly create a new set of quantum computers 158 million times faster than even today's fastest computers…

We'd finally have the level of artificial intelligence (AI) that you see in movies. That's because the biggest limitation on AI today is the robustness of machine-learning algorithms, which are constrained by supercomputing capacity. With quantum computing capacity, you get infinitely improved machine-learning algos and infinitely smarter AI.

We could eradicate disease. We already have tools like gene editing. But the effectiveness of gene editing relies on the robustness of underlying computing capacity to identify, target, insert, cut and repair genes. Insert quantum computing capacity, and all that happens without an error, in seconds, allowing us to truly fix anything about anyone.

We could finally have that million-mile EV. We can only improve batteries if we can test them. And we can only test them in the real world so much. Therefore, the key to unlocking a million-mile battery is through cellular simulation. And the quickness and effectiveness of cellular simulation rests upon the robustness of the underlying computing capacity. Make that capacity 158 million times bigger, and cellular simulation will happen 158 million times faster.

The applications here are truly endless.

But so are the risks

Most of today's cybersecurity systems are built on top of math-based cryptography. That is, they protect data through encryption that can only be cracked by solving a super-complex math problem. Today, that works because classical computers cannot solve those super-complex math problems very quickly.

But a quantum computer 158 million times faster than today's classical computers could solve those problems in the blink of an eye. Therefore, quantum computers threaten to make math-based cryptography as we know it obsolete. And that will compromise the bulk of the world's modern cybersecurity systems.
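To make the "super-complex math problem" concrete: RSA-style encryption rests on how hard it is to split a large number into its two prime factors. A naive classical sketch, with small illustrative semiprimes, shows how fast the work grows with the size of the number; Shor's algorithm on a large quantum computer sidesteps that growth entirely.

```python
import time
from math import isqrt

def trial_factor(n):
    """Naive classical factoring by trial division: cost grows with
    sqrt(n), so each extra pair of digits multiplies the work tenfold.
    RSA moduli, hundreds of digits long, are far out of reach."""
    for d in range(2, isqrt(n) + 1):
        if n % d == 0:
            return d, n // d
    return n, 1

# Semiprimes built from known small primes (illustrative sizes only).
for n in (10_007 * 10_009, 1_000_003 * 1_000_033):
    start = time.perf_counter()
    p, q = trial_factor(n)
    print(f"{n} = {p} * {q}  ({time.perf_counter() - start:.3f}s)")
```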

Insiders call this the Quantum Threat. It's a huge deal. When it arrives, no digital data will be safe.

Back in 2019, computer scientists believed the Quantum Threat to be a distant one, something that might arrive by 2035. However, since then, rapid advancements in quantum computing capability have moved up that timeline considerably. Today, many experts believe the Quantum Threat will arrive in the 2025-to-2030 window.

That means the world needs to start investing in quantum-proof encryption today. And that's why, from an investment perspective, we believe quantum encryption stocks will be among the market's biggest winners in the 2020s.

The global information security market is tracking toward $300 billion. That entire market will inevitably have to shift toward quantum encryption by 2030. Therefore, we're talking about the creation of a $300 billion market to save the planet from a security meltdown.

And at the epicenter of this multi-hundred-billion-dollar, planet-saving megatrend is one tiny startup pioneering the most robust quantum encryption technology platform ever seen…

This company is working with the U.S. and U.K. governments and various other defense and intelligence agencies to finalize its breakthrough technology. The firm plans to launch the quantum encryption system globally in 2023.

If the tech works at scale, this stock, which is trading for less than $20, will roar higher by more than 10X by 2025.

Trust me. This is a stock pick you are not going to want to miss. It may be the single most promising investment opportunity I've come across over the past year.

Gain access to that stock pick and a full portfolio of other potential 10X tech stock picks for the 2020s.

On the date of publication, Luke Lango did not have (either directly or indirectly) any positions in the securities mentioned in this article.

See the article here:
How Quantum Computing Is Already Changing the World - InvestorPlace

Optical computing: the power of light – TechHQ

Optical computers work through photonic transfer. They could be fast, with minimal heat loss during transfer. There's controversy over the promises of photonic technology.

Optical computing is fast becoming a major player, especially in the realm of AI. You'd be forgiven for never having heard of it, but it involves lasers and light speed, so why not find out more?

Optical computers, also known as photonic computers, perform digital computations using, you guessed it, photons. Light waves produced by lasers or incoherent sources are used as a primary means for carrying out numerical calculations, reasoning, artificial intelligence, data processing, data storage and data communications for computing.

Like any computer, an optical computer needs three things to function well: an optical processor, a means of optical data transfer (such as fiber-optic cable) and optical storage.

The history of optical computing is interlinked with the development of radar systems. In the 1960s, the invention of the laser saw the first schemes for an all-optical computer proposed, and since the 1990s, the emphasis has shifted to optical interconnection of arrays of semiconductor smart pixels.

Traditional computers use electrons to carry out calculations, but photons can enable a higher bandwidth; visible and infrared (IR) beams flow across one another without interacting, unlike electrons, so optical systems can be confined to what is effectively two-dimensional computing.

Three-dimensional wiring is necessary in traditional computers to direct electrical currents around one another. So, a photonic computer can be smaller than its more common counterpart. Like traditional computing, optical computers use logic gates and binary routines to perform calculations, but the way these calculations are performed differs.
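One way to picture optical logic, as a toy model only, is intensity thresholding: the sketch below assumes an idealized nonlinear element that transmits light just above a set input intensity (real devices are analog and far lossier).

```python
def optical_gate(beam_a, beam_b, threshold):
    """Toy intensity-threshold logic: two input beams (intensity 0 or 1)
    combine, and an idealized nonlinear element outputs light only if
    the summed intensity exceeds the threshold."""
    return int(beam_a + beam_b > threshold)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", optical_gate(a, b, 1.5),
                    "OR:", optical_gate(a, b, 0.5))
```

Shifting the threshold turns the same element from an AND gate into an OR gate; the hard part, as the critics cited below note, is building cheap, cascadable physical elements that behave this cleanly.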

Optical computing can achieve similarly efficient and reliable computation to the silicon channels and copper wires that enable electronic computers to function, by using plasmonic nanoparticles. Further, the absence of physical wires means that optical computers are less prone to damage from heat or vibrations.

Because photons can be easily manipulated and controlled, photonic computers are faster and more efficient. Photon movements can be guided and controlled in such a way that they can turn corners and carry on without a significant loss of power. Light can be easily contained and loses less information during travel, which is especially useful in situations where interconnects might heat up and slow electrons' movement.

Photonics offer a throughput of more than 1TB/s per channel (of which there can be many in close proximity), compared with copper wire's capability of about 1GB/s per channel.

The hope is that the use of light for information shuttling will result in the development of exascale computers. Exascale computers could perform a quintillion (10^18) calculations every second, 1,000x faster than the current fastest systems.
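A quick back-of-envelope using the per-channel figures above, with an assumed 1 PB dataset, illustrates the gap:

```python
dataset_tb = 1_000                    # assumed 1 PB of data to move
photonic_tb_s = 1                     # >1 TB/s per photonic channel (quoted above)
copper_gb_s = 1                       # ~1 GB/s per copper channel (quoted above)

photonic_min = dataset_tb / photonic_tb_s / 60
copper_hours = dataset_tb * 1000 / copper_gb_s / 3600
print(f"photonic channel: {photonic_min:.0f} min, copper channel: {copper_hours:.0f} h")
# photonic channel: 17 min, copper channel: 278 h
```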

So, we can weigh up the advantages and disadvantages of this alternative mode as follows:

Advantages of optical computing:

- higher bandwidth and throughput than electronics
- minimal heat loss during data transfer
- less prone to damage from heat or vibration
- potentially smaller devices, since beams can cross without interacting

The disadvantages are:

- no optical element yet matches transistors on cost, energy consumption and footprint
- the nonlinear elements needed for digital logic remain immature
- integrating lasers, modulators, waveguides and detectors into one system is hard

There are disagreements among researchers when it comes to the capabilities of optical computers. Whether or not they can compete with semiconductor-based electronic computers in terms of speed, power consumption, cost, and size is an open question.

Critics argue that real-world logic systems require logic level restoration, cascadability, fan-out and input-output isolation, all of which are currently provided by electronic transistors at low cost, low power, and high speed. For optical logic to be competitive beyond niche applications, major breakthroughs in non-linear optical device technology would be required, or even a change in the nature of computing itself.

Another option would be creating a hybrid system that integrates optical solutions into digital computing. However, there are impediments to the use of optics in digital computing that perhaps demand a much more guarded view of the ability of optics to compete with digital electronics.

Digital computing requires nonlinear elements to process digital data. The required functionalities of nonlinear elements are all delivered by transistor circuits in electronic computing. For large scalable logic circuits, no optical element or circuit, active or passive, can do all that and also compete with transistors in the metrics of energy consumption and small device footprint.

In digital communications, fiber optic data transfer is already prevalent. Fiber optics use light for data manipulation. This is the area in which optical technology has advanced the most: it's used enough that it's already common in the lexicon of data transfer.

Fiber optic cables can contain a varying number of glass fibers, along which information is transmitted as light pulses. Fiber optic cables have advantages over copper cables, including higher bandwidth and transmit speeds. You might have noticed that these pros echo those of optical computing.

However, making the switch is much simpler when it comes to fiber-optic cables, which are already used for internet, television and telephone connections.

Areas of active research aiming to overcome some of the current limitations of photonic computing include non-linear optical devices, hybrid optical-electronic systems and better on-chip integration of lasers, modulators and waveguides.

A spinout of MIT, Lightelligence is developing the next generation of computing hardware. Founded in 2017, the company claims to have transformed the cutting-edge technology of photonics into groundbreaking computing solutions, which not only bring exponential improvements in computing power, but also dramatically reduce energy consumption.

In basic terms, its research uses the silicon fabrication platform used for traditional semiconductor chips, but in a novel way. In the optical domain, arithmetic computations are done with physics instead of with logic-gate transistors that require multiple clock cycles.

Yichen Shen, co-founder and CEO of Lightelligence, said that because the system it's developing generates very little heat, it has a lower power consumption than electron-powered chips.

"We're changing the fundamental way computing is done, and I think we're doing it at the right time in history," says Shen. "We believe optics is going to be the next computing platform, at least for linear operations like AI."

Yes, like all of the tech world at the moment, optical computing has a vested interest in AI. However, instead of thinking about how artificial intelligence could help it, photonic computing might facilitate the further development of AI.

For example, self-driving vehicles rely on cameras and AI computations to make quick decisions. The conventional chip doesn't think fast enough to make the split-second decisions necessary, so faster computational imaging is needed for quick decision making. That's what Lightelligence says it's achieving using photonics.

We couldn't talk about radical changes to computational systems without touching on quantum computing. Due to the unique properties of quantum mechanics, quantum computing can solve problems beyond the capabilities of the most advanced computers, including photonic.

The area in which optical computing is ahead of quantum is the speed at which (simpler) calculations can be performed. In some cases, optical computing is faster than quantum. In many cases, optical computing is being researched for use in tandem with quantum computers. Both have the potential to revolutionize computation and data processing.

We've yet to see an optical computer, but we're at the frontier of developments. Since 2012, Moore's law (that the number of transistors in an integrated circuit doubles every two years) has been defunct: AI compute doubles every 3.4 months. We've come incredibly far, incredibly fast.

Photonic computers might be closer than we think.

Read more from the original source:
Optical computing: the power of light - TechHQ