Archive for the ‘Quantum Computer’ Category

Quantum Computing and IBM i – IT Jungle

March 24, 2021 – Alex Woodie

At first glance, IBM i servers and quantum computers appear to be worlds apart. But considering the rapid advance of quantum computing today and the midrange server's place in a long line of advances in business computing, they may not be as far removed as one might think.

That's the general conclusion one could draw from listening to Jack Woehr and Jesse Gorzinski discuss quantum computing during Woehr's presentation, "From Hamilton to Hollerith: What's the Use of Quantum Computers?" at last week's IBM i Futures Conference, which was sponsored by COMMON.

In addition to writing IBM i code and working with open source software on the platform, Woehr, who previously was an editor at the now-defunct Dr. Dobb's Journal, is also active in the quantum computing community. That activity, plus his 40 years of experience as a programmer, gives him a unique perspective on how the future lines of quantum computing and IBM i may intersect.

In Woehr's view, quantum computing is something that younger developers should keep an eye on. The technology is not necessarily ready for mainstream adoption today, but it's moving so quickly and showing such promise that ignoring it would be a mistake, he said.

"Like so many other things, it's where the world is going, and if you want to stay competitive, you're going to have to deal with this," Woehr said. "And if you're young, [quantum computing is] going to be there before you retire."

The IBM Q System One.

Just as it took some time for organizations to accept that digital binary computers were the future and to give up their punch card systems back in the 1940s and 1950s, there will be a period of transition between today's digital binary computers and the quantum computers of tomorrow, Woehr predicted.

"When digital binary computers first came in, they were attaching to punch card machines and saying, look what we can do. And they'd say, well, we can already do that. Why would you want to buy this expensive machine to do that?" Woehr said. "Well, we know the answer to that now. But it wasn't as obvious from 1946 to 1953 as it is now."

It's hard to overstate the impact that modern computers have had on our lives. Many aspects of how we work and play have been digitized, and that digitization has accelerated during COVID-19. The most valuable companies in the world are technology companies (although some would call them data companies).

We have built all this technology on a platform of digital binary computing, which has Boolean algebra as its foundation. "Everything we're doing today electronically is these three operators, and, or, and not, which is all that digital binary computers actually do," Woehr said. "It's had this tremendous effect on our world. But this was again not obvious to the people who would become very adept at operating the paper punch card machines."
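Woehr's point is easy to see in code: every classical digital operation can be reduced to those three operators. The sketch below (plain Python, written for this article rather than taken from the talk) builds XOR and a half adder out of nothing but and, or, and not.

```python
# Every classical digital circuit reduces to AND, OR, and NOT.
# Here XOR and a half adder are built from only those three operators.
def xor(a: bool, b: bool) -> bool:
    return (a or b) and not (a and b)

def half_adder(a: bool, b: bool):
    """Add two bits, returning (sum bit, carry bit)."""
    return xor(a, b), a and b

for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(f"{int(a)} + {int(b)} = sum {int(s)}, carry {int(c)}")
```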

Quantum computing promises to fundamentally transform how we calculate, how we program, and how we develop applications. Instead of two bit values and three basic operators, quantum computing brings a much more capable mathematical underpinning that will unlock new capabilities, Woehr said.

"Quantum computing is multi-dimensional compared to [digital binary computers] because it's not based on Boolean algebra," Woehr said. "It's based on linear algebra: matrices multiplied [by] vectors, and the matrices and vectors are matrices and vectors of complex numbers."

"In digital binary computing, only amplitude factors in, giving us ones and zeros. Well, quantum computing brings in amplitude and phase, for a start, and there's a lot of other things that are different about them. They're [like] digital binary bits, but it's more multi-dimensional than the way we compute now. And it's likely to transform our world in ways that we cannot imagine."
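That linear-algebra picture can be made concrete in a few lines of NumPy. This toy example (an editor's illustration, not code from the presentation) represents a qubit as a vector of two complex amplitudes and a gate as a 2x2 complex matrix; both the magnitudes and the phases of the amplitudes carry information.

```python
import numpy as np

zero = np.array([1, 0], dtype=complex)        # the |0> state as a complex vector

# The Hadamard gate: a 2x2 complex matrix
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ zero                              # matrix times vector: an equal superposition
print(state)                                  # [0.707+0j, 0.707+0j]

# Measurement probabilities come from the squared magnitudes of the amplitudes,
# but the relative phases also matter: applying H again undoes the superposition
# only because the phases interfere.
print(np.abs(state) ** 2)                     # [0.5, 0.5]
print(H @ state)                              # back to [1+0j, 0+0j]
```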

The miracle of digital binary computers allows IBM i developer Jack Woehr to appear to be in Hawaii (he actually lives in Colorado).

IBM, Google, and Microsoft arguably are the leaders in developing quantum computers today, but there are many other companies from around the world making a play, with a variety of designs, some of which will go further than others. It's hard to tell who the leaders will be in the near future because the field is so new and moving so quickly, Woehr said. "We're in a caucus race with quantum computing," he said, referencing the tumultuous footrace in Alice in Wonderland.

In fact, there is a technical term for the raucous quantum din: Noisy Intermediate-Scale Quantum, or NISQ. "What that means is, it sort of works, but it's hard to get the right answer from it. You have to really look at what it's saying," Woehr said.

One of the problems is that quantum computers are not very good at holding onto their state. Getting materials into the quantum state, and keeping them there, is currently a work in progress. That presents a problem when trying to get quantum computers to do useful work, such as solving an optimization problem (which is one class of applications that quantum computers excel at).

There is plenty of work to do in quantum computing, and that work is moving extremely fast. It's unclear exactly when some of these problems will be solved, and when quantum computers will be practical for adoption by businesses. But there's one thing that nobody doubts anymore: that quantum computing actually works.

But that wasn't the case 11 years ago.

"There was some doubt in 2010 if this was real or not," Woehr said. "Even in scientific circles there were doubts whether this was real. But it does work, and we know it works now."

The main benefit is a time advantage, he said. With their richer space of operators and states, quantum computers solve some problems significantly faster than traditional binary computers.

Optimization problems are one of the most promising areas for quantum computers, Woehr said. For example, some companies put a lot of time into calculating how much ore they're likely to remove from a mine. They consider the placement of ore in the mine, along with variables like fuel and labor costs, the weather, and market prices.

"You have these huge, huge optimization problems that have many, many variables, and they put them on supercomputers and run for weeks," Woehr said. "Quantum computing happens to be very good at optimization."
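The scale problem Woehr describes is easy to reproduce in miniature. In this hypothetical sketch (the objective function is a stand-in, not a real mining model), every added yes/no decision doubles the space a classical brute-force search must cover, which is why such jobs can run for weeks on supercomputers.

```python
from itertools import product

N = 20                                                   # 20 yes/no decisions
WEIGHTS = [((i * 7919) % 101) - 50 for i in range(N)]    # arbitrary fixed weights

def profit(choices):
    # Stand-in objective; a real model would weigh ore placement,
    # fuel and labor costs, weather, and market prices.
    return sum(w * pick for w, pick in zip(WEIGHTS, choices))

# Classical brute force must examine all 2**20 (about a million) combinations;
# each extra variable doubles the work.
best = max(product((0, 1), repeat=N), key=profit)
print(best, profit(best))    # selects exactly the positive-weight decisions
```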

This is where the futures of quantum computing and IBM i may intersect. Today, developers are programming quantum computers using open source frameworks like Qiskit (pronounced "kiss-kit"), a project that IBM is behind. Woehr and Gorzinski are both active in the Qiskit community. During their chat, Woehr demonstrated how an optimization problem could be solved on the IBM Q computer using Grover's Algorithm, implemented in Qiskit.

The problem they were solving (finding the optimal combination of ingredients to brew a batch of beer) could be extended to many industries and use cases. Grover's can be used to solve the types of application problems that IBM i folks are familiar with, Woehr said.

"Suddenly everyone here who's listening in will realize this could be any problem," he said. "Grover's will [handle] any kind of problem where we have multiple variables and some combination of their state is a valid solution and some combinations are not."
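The Qiskit sketch below gives a flavor of what Woehr demonstrated, though it is a minimal stand-in rather than his actual beer example: one Grover iteration over two qubits, with an oracle that phase-flips the single "valid" combination |11> and a diffuser that amplifies it.

```python
# A minimal Grover's search in Qiskit (assumes `pip install qiskit`).
# The oracle marks |11> -- a stand-in for "the one ingredient
# combination that makes a good batch."
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h([0, 1])        # uniform superposition over all four combinations

qc.cz(0, 1)         # oracle: flip the phase of the marked state |11>

qc.h([0, 1])        # diffuser: reflect all amplitudes about their mean
qc.z([0, 1])
qc.cz(0, 1)
qc.h([0, 1])

# With four candidates and one marked state, a single Grover iteration
# lands on the answer with certainty.
print(Statevector.from_instruction(qc).probabilities_dict())  # {'11': 1.0}
```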

Gorzinski agreed. "I think it comes back to this notion that there's a tidal wave coming, but there are real applied use cases out there for a lot of industries, especially the industries that the IBM listeners today are probably part of," he said. "It's a competitive scenario in the future. People are going to want to adopt this technology, in my opinion."

Quantum computing may not be mainstream in 10 or even 20 years. But the pace of advances in the field is quickening, Woehr says, and folks who are starting their computing careers today would be wise to keep an eye on how the field develops.

"The reason to look at it now is just to orient yourself," he says. "It may be a while before you see it in your organization. But it's coming. If you are in your 20s now, it's certainly going to be there before you're my age, before you're ready to retire."

More:
Quantum Computing and IBM i - IT Jungle

Japan’s first leading-edge quantum computer to be installed this year – The Mainichi

This photo shows IBM Corp.'s quantum computer that will be installed at Kawasaki Business Incubation Center in Kawasaki, Kanagawa Prefecture. (Photo courtesy of IBM Japan Ltd.)

TOKYO -- Japan will be getting its first leading-edge quantum computer this year.

IBM Japan Ltd. announced on March 23 that the computer made by its U.S. parent IBM Corp. will be installed at Kawasaki Business Incubation Center (KBIC) in the city of Kawasaki, Kanagawa Prefecture, just south of Tokyo. It will be in place within a few months, and will be in operation by the end of the year. The University of Tokyo, which holds exclusive access rights, will seek to put the machine to practical tasks in cooperation with companies through a dedicated consortium.

Quantum computers use quanta -- such as light -- which have the characteristics of both waves and particles, and can make multiple calculations simultaneously using a completely different process from conventional computers. They are expected to be used for purposes including developing new drugs and materials, and managing assets. Japan's first machine will be a "gate-model quantum computer," a type that theoretically has very broad applications. IBM and Google LLC are both developing this type of computer.

The University of Tokyo signed a partnership with IBM Japan in December 2019, and established the Quantum Innovation Initiative Consortium in July 2020 to turn quantum computers to practical use through the cooperation of government, industry and academia. The two universities and 12 companies that make up the consortium include Keio University, Toshiba Corp., Mitsubishi Chemical Holdings Corp. and Mitsubishi UFJ Financial Group Inc. The consortium members will be able to access the quantum computer in Kawasaki through cloud technology.

IBM Corp. currently has more than 30 quantum computers in New York, and at least 140 companies and universities around the world access them through cloud technology. Many members of the Japanese consortium have also used the New York machines, but they are forced to compete for time on the systems with people around the world, limiting access periods. When the quantum computer has been installed in Japan, the consortium members will be able to use it for their research for longer stretches.

Hiroaki Aihara, the consortium's project leader and vice president of the University of Tokyo, said, "It's overwhelmingly advantageous to be able to get a lot of time on a cutting-edge computer. We want to develop quantum computer apps through industry-academia cooperation and accelerate the technology's use." Outside Japan, another quantum computer is set to enter operation in Germany in 2021.

KBIC is a research and development office space equipped with labs for start-ups. IBM Japan also uses the facility as a research center.

(Japanese original by Mayumi Nobuta, Science & Environment News Department)

Read the original post:
Japan's first leading-edge quantum computer to be installed this year - The Mainichi

Quantum Computing – The UK and Europe play catch-up with the USA and China – Electropages

The "my Quantum computer is bigger than yours" game has played out for many years, and the leading contenders in the Qubit superiority race are the USA and China.

Now Europe wants a seat at the big Quantum table, and there are EU consortiums and British-led partnerships aiming not only to develop a hyper-fast computer but, crucially, one that has many practical commercial applications.

So what are they up against? Well, the machine to beat at present is the Chinese computer called Jiuzhang, which the Chinese claim is a mere 10 billion times faster than Google's current offering. China says this gives it Quantum supremacy, but then it would, because that's exactly the term used by Google to describe its own Quantum offering.

Is there a difference between the Chinese machine and Google's? Yes, there is. Jiuzhang makes its calculations using optical circuits, whereas Google's Sycamore uses superconducting materials on a chip, a design that resembles classical computers.

But in the technological chest-thumping world of Quantum computing, there is just one boast that everyone wants to make: "mine's the fastest."

In the need for speed, China's Jiuzhang computer is claimed to be 100 trillion times faster than supercomputers, meaning it can do in seconds what normal computers would take millions of years to achieve. These figures are impressive, but a word of caution: the result depends on what test the Quantum computer was given to perform, as different tests can produce different computational speed results.

Nevertheless, the speed of true Quantum computing is mind-boggling, to say the least, and the real question is how these speeds are achieved. Qubits are how.

Normal computers can only calculate using bits, which have just two working states: 0 or 1. Quantum machines have bits (Qubits) that can occupy numerous different states simultaneously. This is what gives them a tremendous speed boost. Get a load of these Qubits in a synchronised linkage, and they can calculate in seconds what would take a conventional computer millions of years.

Qubits can be realised with atoms, ions, photons or electrons, and they give Quantum computers their inherent parallelism. This means that whereas a conventional computer will work on a single calculation, a Quantum computer can simultaneously work on millions.
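A rough way to see where that parallelism claim comes from (a toy NumPy illustration, not a simulation of any real machine): an n-Qubit register is described by 2^n complex amplitudes, so a single gate applied to the register updates every one of those amplitudes at once.

```python
import numpy as np

n = 10                                    # a 10-qubit register
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                            # start in |00...0>

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Apply a Hadamard to each qubit in turn: the register ends up in a uniform
# superposition over all 2**n classical bit strings.
for q in range(n):
    t = state.reshape([2] * n)
    t = np.tensordot(H, t, axes=([1], [q]))     # one gate touches all amplitudes
    state = np.moveaxis(t, 0, q).reshape(-1)

print(len(state), state[0])               # 1024 amplitudes, each 1/sqrt(1024)
```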

But it's not just about speed. Quantum computing falls short in a big way in three areas: firstly, exactly what tests were run to achieve certain speed results; secondly, whether Quantum computers are reliable; and thirdly, what practical applications they can handle that would make them a commercially viable proposition.

The point about speed tests is that not all speed tests are created equal. Quantum computers have to be set up to perform a specific function. To test Jiuzhang, the computer had to calculate the output of a complex circuit that used light. It detected an average of 40 outputs, and its time to do that was a mere three minutes, whereas one of the world's fastest supercomputers would have taken two billion years to reach the same conclusion. But this was a specially tailored test and didn't necessarily have relevance to broader applications in the commercial world.

Google's Sycamore testing also came under scrutiny from rival IBM, and again the discussion came down to how relevant the testing was in terms of real-world practicality.

Given these out-of-this-world performance figures, The Hitchhiker's Guide to the Galaxy's supercomputer Deep Thought looks pretty pedestrian: it took Deep Thought 7.5 million years to decide that the answer to the question of life, the universe and everything was 42.

Another operational shortfall of Quantum computing is reliability. By their very nature, Qubits are not durable; they are easily upset and need a perfect, temperature-controlled environment that is totally free of vibrations and ambient atomic disturbance. This, of course, can be created to keep the Qubits happy. Still, the length of time they will operate efficiently and accurately is minimal before they slow down and lose their Quantum coherence.

So while we are all astonished at examples of their computational speeds, Quantum computers are nowhere near becoming a commercially viable proposition.

Enter the first European consortium with ambitions to change all that: the snappily titled German Quantum Computer based on Superconducting Qubits (GeQCoS) group, in which Munich chip-maker Infineon and scientists from five research institutes in Germany aim to drive forward the development and industrialisation of Quantum computing.

According to Infineon, Quantum computers have the potential to replace existing conventional computers in specific applications. They could, for example, calculate simulations of complex molecules for the chemical and pharmaceutical industry, complicated optimisations for the automotive and aviation industry, or new findings from the analysis of complex financial data.

The project is funded by the German Ministry of Education and Research and hopes to create a Quantum processor based on superconducting Qubits and demonstrate its special capabilities on a prototype within four years. Working together to achieve this are scientists at the Walther-Meissner-Institute of the Bavarian Academy of Sciences and Humanities and the Technical University of Munich, the Karlsruhe Institute of Technology, the Friedrich-Alexander University of Erlangen-Nuremberg, Forschungszentrum Jülich, the Fraunhofer Institute for Applied Solid State Physics, and Infineon.

"If we in Germany and Europe don't want to be dependent for this future technology solely on American or Asian know-how, we must move forward with the industrialisation now," explained Sebastian Luber, senior director of technology & innovation at Infineon.

Naturally, Germany is not alone in its bid to gain Quantum supremacy. The VTT Technical Research Centre of Finland is also part of a consortium seeking a Quantum technology lead.

It correctly believes superconducting processors could become a key ingredient for creating the next generation of supercomputers. Firstly, they could help tackle the major challenge of scaling up Quantum computers, and secondly, they could speed up traditional supercomputers and drastically cut their power consumption.

A multidisciplinary research project led by VTT will tackle one of the main technical challenges to achieving this: the data transfer to and from the low temperatures required for superconductivity.

The VTT consortium consists of Tampere University in Finland, KTH Royal Institute of Technology in Sweden, ETH Zürich in Switzerland and PTB, the national metrology institute of Germany, and corporate partners Single Quantum in the Netherlands and Polariton Technologies in Switzerland. It is a three-year project.

We know that a Quantum computer's processing power is based on superconducting Qubits operating at extremely low temperatures, and that Qubits are typically controlled by conventional electronics at room temperature, connected through electrical cables. However, when the number of Qubits eventually rises to the required level of hundreds of thousands, the number of control cables needed to match them will generate an extreme heat load that considerably inhibits the Quantum processor's speed.

One solution is to control the Quantum processor with a nearby classical processor. A promising approach is single flux Quantum (SFQ) technology, which emulates traditional computers in logic but uses superconducting technology instead of conventional semiconductors. Because it requires low operational temperatures, SFQ has rarely been used in traditional computers. This disadvantage, however, turns into an advantage when combined with superconducting Quantum computers.

But a major challenge remains. Calculation instructions come to the SFQ processor from a conventional supercomputer, and calculation results must be sent back from the SFQ processor to the same machine. This requires data transfer between extremely low temperatures and room temperature, which doesn't suit conventional semiconductors.

The VTT project's vision is to replace electrical cables with optical fibres and suitable converters that turn optical signals into electrical signals and vice versa. Unlike existing solutions, these components must be able to operate at low temperatures. This will require the development of innovative converters that can drive and read out a simple SFQ processor.

Besides Quantum computers, conventional supercomputers could benefit from the development of optical connections for SFQ technology. A major limitation of supercomputers is the extremely high power consumption of CPUs and GPUs due to the silicon chips' energy dissipation. Replacing silicon chips with superconducting SFQ chips in GPUs could have a notable impact on supercomputers' performance and power consumption.

Here in the United Kingdom, Oxford Instruments NanoScience has announced a significant innovation in its Cryofree dilution refrigerator technology. It believes the advancement of its ProteoxLX dilution refrigerator will take research into Quantum computing to the next level, enabling its commercialisation globally.

Since the launch of Proteox at APS Physics last year, Oxford Instruments has announced partnerships with the University of Glasgow, Rigetti and Oxford Quantum Circuits. Oxford Instruments NanoScience has also secured significant wins outside of Europe, most recently with Proteox selected by SpinQ Technology in China.

"NanoScience is committed to driving leadership and innovation to support the development and commercialisation of Quantum computing around the world," explained Stuart Woods, managing director of Oxford Instruments NanoScience.

The ProteoxLX can maximise Qubit counts with its large sample space and ample coaxial wiring capacity, low-vibration features for reduced noise and support of long Qubit coherence times, and full integration of signal conditioning components.

The LX also provides two fully customisable secondary inserts for an optimised layout of cold electronics and high-capacity input and output lines, fully compatible and interchangeable across the Proteox family. Finally, the ProteoxLX offers 25 μW of cooling power at 20 mK, a low base temperature of < 7 mK, and twin pulse tubes providing up to 4.0 W of cooling power at 4 K.

All these UK and EU corporate and academic consortium-driven projects to advance Quantum computing should give the US and Chinese technologists some challenges in the race to develop a commercially viable machine. Still, I don't expect either the US or China to rest on their Qubit laurels.

More here:
Quantum Computing- The UK and Europe play catch-up with the USA and China. - Electropages

Global Artificial Intelligence in Military Market (2020 to 2025) – Incorporation of Quantum Computing in AI Presents Opportunities -…

DUBLIN--(BUSINESS WIRE)--The "Artificial Intelligence in Military Market by Offering (Software, Hardware, Services), Technology (Machine Learning, Computer vision), Application, Installation Type, Platform, Region - Global Forecast to 2025" report has been added to ResearchAndMarkets.com's offering.

The Artificial Intelligence in military market is estimated at USD 6.3 billion in 2020 and is projected to reach USD 11.6 billion by 2025, at a CAGR of 13.1% during the forecast period.

The Artificial Intelligence in Military market includes major players such as BAE Systems Plc. (UK), Northrop Grumman Corporation (US), Raytheon Technologies Corporation (US), Lockheed Martin Corporation (US), Thales Group (France), L3Harris Technologies, Inc. (US), Rafael Advanced Defense Systems (Israel), and IBM (US), among others. These players have spread their business across various regions, including North America, Europe, Asia Pacific, the Middle East & Africa, and Latin America. COVID-19 has affected the AI in military market's growth to some extent, and this varies from country to country. Industry experts believe, however, that the pandemic has not affected the demand for Artificial Intelligence in defense applications.

Based on platform, the space segment of the Artificial Intelligence in military market is projected to grow at the highest CAGR during the forecast period

The space AI segment comprises CubeSats and satellites. Artificial intelligence systems for space platforms include various satellite subsystems that form the backbone of different communication systems. The integration of AI with space platforms facilitates effective communication between spacecraft and ground stations.

Software segment of the Artificial Intelligence in Military market by offering is projected to witness the highest CAGR during the forecast period

Based on offering, the software segment is projected to witness the highest CAGR during the forecast period, owing to the significance of AI software in strengthening the IT framework to prevent security breaches. Technological advances in the field of AI have resulted in the development of advanced AI software and related software development kits. AI software incorporated in computer systems is responsible for carrying out complex operations: it synthesizes the data received from hardware systems and processes it in an AI system to generate an intelligent response.

The North American market is projected to contribute the largest share from 2020 to 2025 in the Artificial Intelligence in Military market

The US and Canada are key countries considered for market analysis in the North American region. This region is expected to lead the market from 2020 to 2025, owing to increased investments in AI technologies by countries in this region. This market is led by the US, which is increasingly investing in AI systems to maintain its combat superiority and overcome the risk of potential threats on computer networks. The US plans to increase its spending on AI in the military to gain a competitive edge over other countries.

Within North America, the US is recognized as one of the key manufacturers, exporters, and users of AI systems worldwide and is known to have the strongest AI capabilities. Key manufacturers of AI systems in the US include Lockheed Martin, Northrop Grumman, L3Harris Technologies, Inc., and Raytheon. The new defense strategy of the US indicates an increase in AI spending to include advanced capabilities in existing defense systems of the US Army to counter incoming threats.

For more information about this report visit https://www.researchandmarkets.com/r/acjap9

Read more from the original source:
Global Artificial Intelligence in Military Market (2020 to 2025) - Incorporation of Quantum Computing in AI Presents Opportunities -...

The Quantum Age Will Require a Quantum Generation – Fair Observer

Past the glow of the Shanghai evening, a single red beam threads its way into the silent stratosphere. It is a laser originating from a laboratory whose machinery few can operate or explain. The laser is meant to bounce off a distant satellite before returning, for the purpose of encrypting an otherwise earthly conversation in a manner so secure it was once thought impossible.

China's pursuit of quantum technologies, quantum supremacy and a leadership stake in the much-heralded quantum future awaiting us is as well documented as the United States' similar quest. Additionally, opinions concerning quantum computing's significance to global security, business and geopolitics range from comprehensive analyses by the industry's experts to the musings of its pluckiest amateurs. Just as bountiful are the resources, both formally structured and open source, available to anyone interested in the technical functionality, universal physical properties, revolutionary new bits or pioneering logic gates powering such complex, world-changing machinery.

So, rather than using this space for another overloaded elucidation of quantum computing's principles, our focus must pivot to the need for, and the already encouraging progress toward, educating the next generation of computer scientists, developers and engineers in what any of these words and concepts mean, what quantum computing is and, just as importantly, what it can be.

What quantum computing can be is the most significant technological, economic and governmental capability in human history. It will empower its masters to blow away the capabilities available via traditional computers, solving problems in seconds that would take today's machines years to complete. What it will also be, for some time, is expensive, uncertain and a bit scary. But this is all the more evidence of our need to understand its most fruitful applications as well as its limitations, whatever those might be.

What quantum will be is what the next generation of students, the most technologically skilled cohort ever assembled, refers to not as quantum, but simply as computing.

While the hype over quantum computing's transformational capabilities across sectors, industries and regions has been building for decades, too many American public policy proposals in the quantum realm have begun and ended with public investment in hardware, sparing little attention or resources for the education of the next generation of engineers who, for any national quantum program or policy to succeed, must be equipped to use it. Some encouraging developments, however, indicate that the importance of quantum education and a quantum-skilled workforce may finally be taking root. Several entities are leading the charge to identify quantum education as a critical need, as collections of the right leaders in the right rooms (virtual or otherwise) conduct the first wave of conversations necessary to educate the workforce the quantum age will require.

The first such institution is the US Army. Placing a renewed emphasis on the development of its people, and the attraction of top industry talent to roles of public service, the US Army has led the way from a federal standpoint in committing to the modernization of its workforce for the quantum age. Though encouraging, it is important to note that such an undertaking must be generational in its scope and investment to be successful, as the biggest organizations, like the biggest ships, change direction most slowly.

The national security implications of quantum computers are a likely driving force behind the Army's design of its Quantum Leap initiative. With that said, it is encouraging that, given those concerns, the Army has responded with a people-first focus on developing, attracting and retaining the kind of talent necessary to steward the weaponry of the future capably and responsibly.

A layer beneath the modernization of federal agencies sits the collaborative approach of the US National Science Foundation, the White House Office of Science and Technology Policy, and a smattering of the country's largest technology firms, referred to as the National Q-12 Education Partnership. The appeal of such a public-private endeavor is clear and mutual, as both America's public and private entities have a stake in seeing the next generation of quantum leaders developed in the US.

Such partnerships should set ambitious goals for themselves and inclusively embrace the full breadth of talent waiting for them within a generation that is as unprecedentedly tech-savvy as it is diverse. Quantum must be more than yet another driver of inequality; its transformational potential is too great to hoard in Palo Alto or Cambridge. As such, partnerships like these must emphasize the inclusion of institutions like America's historically black colleges and universities, which have done tremendous work to close achievement gaps in STEM fields, to tap their talents as indispensable leaders of this historic educational effort.

Finally, local initiatives to educate the next generation of quantum engineers mark perhaps the most American solution of all to this challenge. University leaders should take a lesson from their counterparts at UC Santa Barbara, who are partnering with local school districts to tailor quantum educational programming to students of all ages and ability levels. While federal support for such programming is surely welcome, universities and K-12 institutions need not wait for Washington to start training and identifying the future leaders of a quantum age approaching as fast as the photons flying over Shanghai.

Quantum computers, like traditional computers, televisions, toasters, phones and radios, will be neither good nor bad. But they will be here, available for common personal and business use, soon. Education in their design, functionality and best uses will allow for the formulation of informed, forward-looking, strategic quantum computing governance policies rooted outside the binary choice between ignorant cynicism and naive optimism. Rooted, that is, in the messy nuances of reality and in the goal of every stubborn innovator: to not just build the thing right, but to build the right thing.

The views expressed in this article are the author's own and do not necessarily reflect Fair Observer's editorial policy.

Here is the original post:
The Quantum Age Will Require a Quantum Generation - Fair Observer