Archive for the ‘Quantum Computer’ Category

Why AI Geniuses Haven’t Created True Thinking Machines – Walter Bradley Center for Natural and Artificial Intelligence

As we saw yesterday, artificial intelligence (AI) has enjoyed a string of unbroken successes against humans. But these are successes in games where the map is the territory. Therefore, everything is computable.

That fact hints at the problem tech philosopher and futurist George Gilder raises in Gaming AI (free download here). Whether all human activities can be treated that way successfully is an entirely different question. As Gilder puts it, AI is a system built on the foundations of computer logic, and when Silicon Valley's AI theorists push the logic of their case to a singularity, they defy the most crucial findings of twentieth-century mathematics and computer science.

Here is one of the crucial findings they defy (or ignore): Philosopher Charles Sanders Peirce (1839–1914) pointed out that, generally, mental activity comes in threes, not twos (so he called it triadic). For example, you see a row of eggs in a carton and think "12." You connect the objects (eggs) with a symbol, "12."

In Peirce's terms, you are the interpretant, the one for whom the symbol "12" means something. But eggs are not "12." "12" is not eggs. Your interpretation is the third factor that makes "12" mean something with respect to the eggs.

Gilder reminds us that, in such a case, "the map is not the territory" (p. 37). Just as "12" is not the eggs, a map of California is not California. To mean anything at all, the map must be read by an interpreter. AI supremacy assumes that the machine's map can somehow be big enough to stand in for the reality of California and eliminate the need for an interpreter.

The problem, he says, is that the map is not and never can be reality. There is always a gap:

Denying the interpretant does not remove the gap. It remains intractably present. If the inexorable uncertainty, complexity, and information overflows of the gap are not consciously recognized and transcended, the gap fills up with noise. Congesting the gap are surreptitious assumptions, ideology, bias, manipulation, and static. AI triumphalism allows it to sink into a chaos of constantly changing but insidiously tacit interpretations.

Ultimately AI assumes a single interpretant created by machine learning as it processes ever more zettabytes of data and converges on a single interpretation. This interpretation is always of a rearview mirror. Artificial intelligence is based on an unfathomably complex and voluminous look at the past. But this look is always a compound of slightly wrong measurements, thus multiplying its errors through the cosmos. In the real world, by contrast, where interpretation is decentralized among many individual minds, each person interpreting each symbol, mistakes are limited, subject to ongoing checks and balances, rather than being inexorably perpetuated onward.

Does this limitation make a difference in practice? It helps account for the ongoing failure of Big Data to provide consistently meaningful correlations in science, medicine, or economics research. Economics professor Gary Smith puts the problem this way:

Humans naturally assume that all patterns are significant. But AI cannot grasp the meaning of any pattern, significant or not. Thus, from massive number crunches, we may learn (if that's the right word) that

Stock prices can be predicted from Google searches for the word "debt."

Stock prices can be predicted from the number of Twitter tweets that use "calm" words.

An unborn baby's sex can be predicted by the amount of breakfast cereal the mother eats.

Bitcoin prices can be predicted from stock returns in the paperboard-containers-and-boxes industry.

Interest rates can be predicted from Trump tweets containing the words "billion" and "great."

If the significance of those patterns makes no sense to you, it's not because you are not as smart as the Big Data machine. Those patterns shouldn't make any sense to you. There's no sense in them because they are meaningless.

Smith, author with Jay Cordes of The Phantom Pattern Problem (Oxford, 2020), explains that these phantom patterns are a natural occurrence within the huge amounts of data that big computers crunch:

even random data contain patterns. Thus the patterns that AI algorithms discover may well be meaningless. Our seduction by patterns underlies the publication of nonsense in good peer-reviewed journals.

Yes, such meaningless findings from Big Data do creep into science and medicine journals. That's partly a function of thinking that a big computer can do our thinking for us even though it can't recognize the meaning of patterns. It's what happens when there is no interpreter.

Ah, but, so we are told, quantum computers will evolve so as to save the dream of true thinking machines. Gilder has thought about that one too. In fact, he's been thinking about it since 1989, when he published Microcosm: The Quantum Era in Economics and Technology.

It's true that, in the unimaginably tiny quantum world, electrons can do things we can't:

A long-ago thought experiment of Einstein's showed that once any two photons (or other quantum entities) interact, they remain in each other's influence no matter how far they travel across the universe (as long as they do not interact with something else). Schrödinger christened this "entanglement": The spin (or other quantum attribute) of one behaves as if it reacts to what happens to the other, even when the two are impossibly remote.

But, he says, it's also true that continuously observing a quantum system will immobilize it (the quantum Zeno effect). As John Wheeler reminded us, we live in a participatory universe where the observer (Peirce's interpretant) is critical. So quantum computers, however cool they sound, still play by rules where the interpreter matters.

In any event, at the quantum scale, we are trying to measure atoms and electrons using instruments composed of atoms and electrons (p. 41). That is self-referential and introduces uncertainty into everything: With quantum computing, you still face the problem of creating an analog machine that does not accumulate errors as it processes its data (p. 42). Now we are back where we started: Making the picture within the machine much bigger and more detailed will not make it identical to the reality it is supposed to interpret correctly.

And remember, we still have no idea how to make the Ultimate Smart Machine conscious because we don't know what consciousness is. We do know one thing for sure now: If Peirce is right, we could turn most of the known universe into processors and still not produce an interpreter (the consciousness that understands meaning).

Robert J. Marks points out that human creativity is non-algorithmic and therefore uncomputable. From which Gilder concludes, "The test of the new global ganglia of computers and cables, worldwide webs of glass and light and air, is how readily they take advantage of unexpected contributions from free human minds in all their creativity and diversity. These high-entropy phenomena cannot even be readily measured by the metrics of computer science" (p. 46).

It's not clear to Gilder that the AI geniuses of Silicon Valley are taking this in. The next Big Fix is always just around the corner and the Big Hype is always at hand.

Meanwhile, the rest of us can ponder an idea from technology philosopher George Dyson: "Complex networks, of molecules, people or ideas, constitute their own simplest behavioral descriptions" (p. 53). He was explaining why analog quantum computers would work better than digital ones. But, considered carefully, his idea also means that you are ultimately the best definition of you. And that's not something that a Big Fix can just get around.

Here's the earlier article: "Why AI geniuses think they can create true thinking machines." Early on, it seemed like a string of unbroken successes. In Gaming AI, George Gilder recounts the dizzying achievements that stoked the ambition, and the hidden fatal flaw.

See the original post here:
Why AI Geniuses Haven't Created True Thinking Machines - Walter Bradley Center for Natural and Artificial Intelligence

Every Thing You Need to Know About Quantum Computers – Analytics Insight

Quantum computers are machines that use the properties of quantum physics to store data and perform calculations based on the probability of an object's state before it is measured. This can be extremely advantageous for certain tasks, where they could vastly outperform even the best supercomputers.

Quantum computers can process massive and complex datasets more efficiently than classical computers. They use the fundamentals of quantum mechanics to speed up the process of solving complex calculations. Often, these computations incorporate a seemingly unlimited number of variables, and the potential applications span industries from genomics to finance.

Classical computers, which include smartphones and laptops, carry out logical operations using the definite position of a physical state. They encode information in binary bits that can be either 0s or 1s. In quantum computing, operations instead use the quantum state of an object to produce the basic unit of memory, called a quantum bit or qubit. Qubits are made using physical systems, such as the spin of an electron or the orientation of a photon. These systems can be in many different arrangements all at once, a property known as quantum superposition. Qubits can also be inextricably linked together using a phenomenon called quantum entanglement. The result is that a series of qubits can represent different things simultaneously. Quantum states are the undefined properties of an object before they've been detected, such as the spin of an electron or the polarization of a photon.

Instead of having a clear position, unmeasured quantum states exist in a mixed superposition, and they can be entangled with the states of other objects so that their final outcomes remain mathematically related even after the objects separate. The complex mathematics behind these unsettled states, like so many entangled spinning coins, can be plugged into special algorithms to make short work of problems that would take a classical computer a long time to work out.
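The state-vector picture sketched above can be illustrated in a few lines of plain Python (an illustrative aside, not part of the original article): amplitudes are just numbers, and measurement probabilities are their squares.

```python
from math import sqrt

# A single qubit in equal superposition: (|0> + |1>) / sqrt(2).
# The two amplitudes form the state vector; squaring them gives
# the probability of measuring 0 or 1.
plus = [1 / sqrt(2), 1 / sqrt(2)]
probs = [a * a for a in plus]  # both outcomes equally likely: [0.5, 0.5]

# A two-qubit Bell (entangled) state: (|00> + |11>) / sqrt(2).
# Amplitude order: |00>, |01>, |10>, |11>.
bell = [1 / sqrt(2), 0.0, 0.0, 1 / sqrt(2)]
bell_probs = [a * a for a in bell]
# Only |00> and |11> carry probability, so measuring one qubit
# immediately determines the other: the entanglement described above.
```

Two qubits already need four amplitudes; n qubits need 2^n, which is why classical simulation of quantum states quickly becomes intractable.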

American physicist and Nobel laureate Richard Feynman anticipated quantum computers as early as 1959. He stated that when electronic components begin to reach microscopic scales, effects predicted by quantum mechanics occur, which might be exploited in the design of more powerful computers.

During the 1980s and 1990s, the theory of quantum computers advanced considerably beyond Feynman's early speculation. In 1985, David Deutsch of the University of Oxford described the construction of quantum logic gates for a universal quantum computer. In 1994, Peter Shor of AT&T devised an algorithm to factor numbers with a quantum computer that would require as few as six qubits. Later, in 1998, Isaac Chuang of Los Alamos National Laboratory, Neil Gershenfeld of the Massachusetts Institute of Technology (MIT) and Mark Kubinec of the University of California created the first quantum computer with 2 qubits that could be loaded with data and output a solution.
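To give a sense of why Shor's algorithm matters, here is a hedged Python sketch of its classical skeleton: factoring N reduces to finding the multiplicative order of some number a modulo N, and the quantum computer's only job is to find that order exponentially faster than the brute-force loop used below.

```python
from math import gcd

def order(a, n):
    """Multiplicative order of a mod n, by brute force. The quantum
    speedup in Shor's algorithm comes entirely from doing this step
    fast; everything else is classical arithmetic."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Classical post-processing of Shor's algorithm, for a coprime to n."""
    r = order(a, n)
    if r % 2:
        return None  # need an even order; retry with another a
    y = pow(a, r // 2, n)
    p, q = gcd(y - 1, n), gcd(y + 1, n)
    return sorted((p, q)) if p * q == n else None

print(shor_classical(15, 7))  # [3, 5]
```

For 15 and a = 7 the order is 4, so gcd(7² ± 1, 15) yields the factors 3 and 5; at cryptographic sizes the brute-force `order` loop is hopeless classically, which is the whole point.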

Recently, physicist David Wineland and his colleagues at the US National Institute of Standards and Technology (NIST) announced that they have created a 4-qubit quantum computer by entangling four ionized beryllium atoms using an electromagnetic trap. Today, quantum computing is poised to upend entire industries, from telecommunications to cybersecurity, advanced manufacturing, finance, medicine and beyond.

There are three primary types of quantum computing. Each type differs by the amount of processing power (qubits) needed and the number of possible applications, as well as the time required to become commercially viable.

Quantum annealing is best suited to optimization problems, where researchers are trying to find the best and most efficient configuration among many possible combinations of variables.

Volkswagen recently conducted a quantum experiment to optimize traffic flows in the overcrowded city of Beijing, China. The experiment was run in partnership with Google and D-Wave Systems. The Canadian company D-Wave developed the quantum annealer, though it is difficult to tell whether it actually exhibits any real quantum behavior so far. The algorithm successfully reduced traffic by choosing the ideal path for each vehicle.
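A quantum annealer is beyond a short snippet, but its classical cousin, simulated annealing, conveys the idea. The sketch below is an illustrative assumption (a toy "congestion" objective, not the Volkswagen experiment): it searches over binary route choices, accepting worse moves with a probability that shrinks as the temperature cools, so the system settles into a low-energy configuration.

```python
import math
import random

random.seed(1)

# Toy objective: neighboring "cars" that pick the same of two routes
# add congestion; the minimum-energy state alternates routes.
def energy(routes):
    return sum(routes[i] == routes[i + 1] for i in range(len(routes) - 1))

state = [random.randint(0, 1) for _ in range(10)]
temp = 2.0
for _ in range(2000):
    i = random.randrange(len(state))
    candidate = state[:]
    candidate[i] ^= 1  # flip one car's route choice
    delta = energy(candidate) - energy(state)
    # Always accept improvements; accept worse moves with
    # Boltzmann probability exp(-delta / temp).
    if delta <= 0 or random.random() < math.exp(-delta / temp):
        state = candidate
    temp *= 0.999  # gradual cooling

print(energy(state))
```

A quantum annealer explores the same kind of energy landscape, but uses quantum tunneling rather than thermal jumps to escape local minima.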

Quantum simulations explore specific problems in quantum physics that are beyond the capacity of classical systems. Simulating complex quantum phenomena could be one of the most important applications of quantum computing. One area that is particularly promising for simulation is modeling the effect of a chemical stimulation on a large number of subatomic particles, a field known as quantum chemistry.

Universal quantum computers are the most powerful and most generally applicable, but also the hardest to build. A universal quantum computer would likely require over 100,000 qubits, and some estimates put the figure at one million. Disappointingly, the most qubits we can access now is just 128. The basic idea behind the universal quantum computer is that you could direct the machine at any massively complex computation and get a quick solution. This includes solving the aforementioned annealing equations, simulating quantum phenomena, and more.

See original here:
Every Thing You Need to Know About Quantum Computers - Analytics Insight

Quantum Computing and the Cryptography Conundrum – CXOToday.com

By: Anand Patil

On October 23, 2019, researchers from Google made an official announcement of a major breakthrough, one that scientists compared to the Wright brothers' first flight, or even man's first moon landing. They claimed to have achieved Quantum Supremacy, meaning that they had created a Quantum Computer that could perform a calculation considered impossible by the classical computers of today. The announcement was a landmark, highlighting the possibilities of Quantum Computing.

The concept of Quantum Computing itself isn't new. It is a field that has been a point of interest for physicists and computer researchers since the 1980s. Google's announcement, however, has brought it to the mainstream and shone a spotlight on the promise that this niche field of innovation holds. Of course, as someone once said, with great power comes great responsibility, so this field isn't without complexities.

The Possibilities of Quantum Computing

Quantum Computing is a branch of computer science that is focused on leveraging the principles of quantum physics to develop computer technology. Quantum Computers hold the promise to power major advances in various fields that require complex calculations, from materials science and pharmaceuticals to aerospace and artificial intelligence (AI).

So far, Quantum Computers have been nothing more than fancy laboratory experiments, large and expensive, but they have successfully demonstrated that the underlying principles are sound and have the potential to transform industries and accelerate innovation like never before. This has spurred scientific and industrial interest in this nascent field, giving rise to multiple projects across the world in pursuit of creating a viable, general-use Quantum Computer. That said, it may still be many years before Quantum Computers are commercially and generally available.

So Why Does It Matter Today?

The possibility of Quantum Computers poses a serious challenge to the cryptographic algorithms deployed widely today. Today's key-exchange algorithms, like RSA, Diffie-Hellman, and others, rely on very difficult mathematical problems, such as prime factorization, for their security, which a Quantum Computer would be able to solve much faster than a classical computer.

For example, it would take a classical computer centuries, or even longer, to break modern algorithms like DH or RSA-2048 by using brute-force methods. However, given the power and efficiency of quantum machines at calculations such as finding the prime factors of large numbers, it may be possible for a quantum computer to break current asymmetric algorithms in a matter of days.
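To make the threat concrete, here is a deliberately tiny, hypothetical RSA example in Python (the primes are toy values; real keys use 2048-bit moduli and padded messages): anyone who can factor the public modulus n recovers the private key immediately, which is exactly the step a quantum computer running Shor's algorithm would accelerate.

```python
# Toy RSA keypair -- illustrative only.
p, q = 61, 53
n = p * q                      # public modulus (3233)
phi = (p - 1) * (q - 1)
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent (Python 3.8+ modular inverse)

message = 42
cipher = pow(message, e, n)    # public-key encryption

# An attacker who factors n (trivial here, infeasible classically at
# 2048 bits, fast on a large quantum computer) rebuilds the private key:
def crack(n, e, p_found):
    q_found = n // p_found
    return pow(e, -1, (p_found - 1) * (q_found - 1))

d_attacker = crack(n, e, 61)
assert pow(cipher, d_attacker, n) == message  # plaintext recovered
```

Note that the attack needs only the public values n and e plus the factorization, which is why recorded ciphertext remains at risk even if it is captured years before such a machine exists.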

So, while the encrypted internet is not at risk at the moment, all that a bad actor has to do is capture encrypted data today, including the initial key exchange, and then wait until a powerful enough quantum computer is available to decrypt it. This is particularly a problem for organizations that have large amounts of sensitive data they need to protect over the long term, such as banks, governments and defense agencies.

What Can I Do Now?

For organizations that could be at risk in the future, this is the best time to start evaluating post-quantum cryptography. Simply put, this means moving to algorithms and/or keys that are a lot more robust and can withstand a brute-force attack by a quantum computer, i.e., that are quantum-resistant.

The National Institute of Standards and Technology (NIST) in the US is leading the effort towards the standardization of post-quantum secure algorithms. However, given the lengthy process involved, this may take many years to fructify.

An alternative is to use Quantum Key Distribution (QKD) techniques with existing algorithms that are considered quantum-safe. This involves using a dedicated optical channel to exchange keys using the quantum properties of photons. Any attempt to tap this secure channel will change the quantum state of the photons and can be immediately detected, so the key is effectively unhackable. One limitation of QKD in this method is the need for a dedicated optical channel that cannot span more than 50 km between the two terminals. Of course, this also means that the existing encryption devices or routers must be capable of ingesting such quantum-generated keys.
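The QKD scheme described above is usually illustrated with the BB84 protocol. The classical simulation below is a hedged sketch of the bookkeeping only (no real photons, and no eavesdropper is modeled): Alice and Bob keep just the bits where their randomly chosen measurement bases happened to match.

```python
import random

random.seed(0)

n = 32  # number of photons sent in this toy run

# Alice picks a random bit and a random basis (0 = rectilinear,
# 1 = diagonal) for each photon; Bob picks his own random bases.
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.randint(0, 1) for _ in range(n)]
bob_bases   = [random.randint(0, 1) for _ in range(n)]

# With no eavesdropper, Bob reads Alice's bit whenever the bases match;
# a mismatched basis yields a coin flip (modeled classically here).
bob_bits = [bit if ab == bb else random.randint(0, 1)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Alice and Bob publicly compare bases (never bits) and keep only
# the positions where the bases agreed -- the shared secret key.
key_alice = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
key_bob   = [bit for bit, ab, bb in zip(bob_bits,  alice_bases, bob_bases) if ab == bb]
assert key_alice == key_bob
```

In the real protocol, an eavesdropper measuring in the wrong basis disturbs the photon states, so a sample of the key bits is compared publicly to detect tampering before the rest is used.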

Post-Quantum Cryptography and Cisco

Cisco is an active contributor to the efforts to standardize post-quantum algorithms. However, recognizing that an implementable standard may be some years away, work is ongoing to ensure that organizations can implement quantum-resistant encryption techniques in the interim that leverage existing network devices, like routers, which are most commonly used as encryptors.

To start with, a team of veteran technical leaders and cryptography experts from Cisco US (David McGrew, Scott Fluhrer and Lionel Florit) and the engineering team in Cisco India (led by Amjad Inamdar and Ramas Rangaswamy) developed an API called the Secure Key Import Protocol, or SKIP, through which Cisco routers can securely ingest keys from an external post-quantum key source. This allows existing Cisco routers to be quantum-ready with just the addition of an external QKD system. Going forward, the team is working on a way to deliver quantum-safe encryption keys without the need for short-range point-to-point connections.

The advantage of this method is that organizations can integrate post-quantum key sources with existing networking gear in a modular fashion without the need to replace anything already installed. In this manner, you could create a quantum-ready network for all traffic with minimal effort.

Getting Ready for the Post-Quantum World

Quantum Supremacy is an event demonstrating that a quantum machine can solve a problem no classical computer can solve in a feasible amount of time. This race has gathered momentum in the recent past, with several companies jumping on the bandwagon and some even claiming to have achieved it.

There is an unprecedented amount of attention focused on making a commercially viable quantum computer. Many believe it is inevitable, and only a question of time. When it does happen, the currently used cryptography techniques will become vulnerable, and therefore be limited in their security. The good news is, there are methods available to adopt strong encryption techniques that will remain secure even after quantum computers are generally available.

If you are an organization that wants to protect its sensitive data over the long term, you should start to evaluate post-quantum secure encryption techniques today. By leveraging existing networking infrastructure and adding suitable post-quantum key distribution techniques, it is possible to take a quantum leap in securing your data.

(The author is Director, Systems Engineering, Cisco India and SAARC and the views expressed in this article are his own)

Read the original post:
Quantum Computing and the Cryptography Conundrum - CXOToday.com

Quantum computing will impact the enterprise–we just don’t know how – TechRepublic

Quantum computing promises to take on problems that were previously unsolvable. This whole new level of compute power will make it possible to crunch incredible volumes of data that traditional computers can't manage. It will allow researchers to develop new antibiotics, polymers, electrolytes, and so much more.

While the options for quantum computing uses may seem endless, the enterprise is still deciding if this is all just a pipe dream or a future reality.

TechRepublic Premium recently surveyed 598 professionals to learn what they know about quantum computing and what they don't. This report will fill in some of those gaps.

The survey asked respondents about their understanding of quantum computing, which companies are leading its development, and what impact it will have on the enterprise.

Quantum computing is unknown territory for almost all of the survey respondents: 90% stated that they had little to no understanding of the topic, and only 11% of the 598 respondents said they had an excellent understanding of quantum computing.

Further, 36% of respondents said they were not sure which company was leading the race to develop a quantum computer. IBM got 28% of the votes, and Google got 18%. 1QBit and D-Wave each got 6% of votes. Honeywell came in at 3%.

In terms of industry impact, more than half of the respondents (58%) said that quantum computing will have either a significant impact or somewhat of an impact on the enterprise. While all industries will benefit through different use cases, because quantum computing allows data to be consumed and processed faster while using less energy, 42% of survey respondents said IT would benefit the most. The pharmaceutical and finance sectors followed at 14% and 12%, respectively.

To read all of the survey results, plus analysis, download the full report.

Read the original:
Quantum computing will impact the enterprise--we just don't know how - TechRepublic

IBM and Mastercard among partners of €11.1m Irish quantum project – Siliconrepublic.com

A new €11.1m project has launched with the aim of uniting Ireland's various quantum computer research groups.

Some of the biggest names in tech and research have joined forces with the aim of bolstering Ireland's quantum computer efforts. The €11.1m Quantum Computing in Ireland (QCoir) initiative will work on a software platform integrating multiple quantum bit technologies being developed in Ireland.

Unlike a traditional binary computer, which uses binary bits that can be either one or zero, a quantum bit (qubit) can be one, zero or both at the same time. This gives quantum computers the power to solve some of the world's most complex problems in a fraction of the time it would take a binary computer.

QCoir partners include Equal1 Labs, IBM, Rockley Photonics, Maynooth University, the Tyndall National Institute, University College Dublin and Mastercard. The project received €7.3m in funding under the Disruptive Technologies Innovation Fund, a €500m fund established under Project Ireland 2040.

"Quantum computing is seen as the future of computer technology," said Dr Emanuele Pelucchi, head of epitaxy and physics of nanostructures at Tyndall, based at University College Cork.

"It's computing built on the principles of quantum physics, creating, storing and accessing data at atomic and subatomic levels to create vastly powerful computers."

Sources of multiple entangled photons uniquely allow for preparation of highly entangled quantum states. QCoir will leverage the on-chip photonic qubit platform based on site-controlled III-V quantum dots. These unique dots were developed at Tyndall.

Tyndalls CEO, Prof William Scanlon, added that the partnership will set the foundations for a national quantum ecosystem.

"It brings together hardware and software providers with application users, and sees multinationals working side by side with researchers and SMEs," he said.

"These kinds of industry and academic research partnerships are what will allow Ireland to build a quantum value proposition at international scale."

Quantum computing research is continuing to progress in Ireland. Earlier this year, a team from Trinity College Dublin said it had taken a major step towards the holy grail of quantum computing: a stable, small-scale quantum computer.

See the original post:
IBM and Mastercard among partners of 11.1m Irish quantum project - Siliconrepublic.com