Archive for the ‘Quantum Computing’ Category

Innovations Shaping the Future – Qrius

It took humans over 65 thousand years to invent the wheel, and it was one of the greatest technological breakthroughs in the history of humanity. Today, almost six thousand years later, several no less important revolutions happen each year, and the pace of change continues to accelerate exponentially.

New technologies are literally changing the world, making our tomorrow mysterious and unpredictable. But what can we do to embrace the future? Unfortunately, we cannot make accurate predictions, but we can certainly keep track of the trends in software development in Europe appearing on the horizon.

In this article, we outline technologies that have the potential to spread around the globe in the next 1-10 years. So let's peek behind the veil of the time to come together!

The idea of quantum computing is not entirely new. Scientists began thinking about it at the beginning of the 1980s, and it is still in the early stages of research and development. Nevertheless, quantum computing has already become an area of focus in the tech world, and it's anticipated to become a general trend in software development within the next decade. So what is so special about quantum computers?

In quantum computing, data is encoded in so-called quantum bits (qubits), which can exist in superpositions of states. In simple words, this means that a qubit can represent 0 and 1 at the same time, unlike regular bits, which can be in a state of either 0 or 1. Superposition allows qubits to increase computing capacity, minimize the risk of error, make accurate predictions and significantly improve data encryption, potentially to the level where it is impossible to hack. It's expected that quantum computers will eventually outperform traditional computers, even the most powerful ones.
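
To make the superposition idea concrete, here is a minimal sketch in plain Python (an illustration only, not tied to any real quantum SDK): a qubit can be modeled as a pair of amplitudes whose squared magnitudes give the probabilities of reading 0 or 1 when the qubit is measured.

```python
import math

# A qubit's state is a pair of amplitudes (a, b) for the states |0> and |1>,
# constrained so that |a|^2 + |b|^2 = 1. The squared magnitudes are the
# probabilities of reading 0 or 1 when the qubit is measured.
zero = (1.0, 0.0)                        # definitely 0
one  = (0.0, 1.0)                        # definitely 1
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))  # equal superposition of 0 and 1

def probabilities(state):
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

print(probabilities(plus))  # both values close to 0.5: a fair "quantum coin"
```

A regular bit is always one of the first two states; the third state has no classical counterpart, which is exactly what the paragraph above means by "0 and 1 at the same time."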

The largest tech corporations are now racing to build the first quantum chip that actually works as it's supposed to, but the main problem is that it's quite hard to maintain a qubit in a superposition for a long (or at least sufficient) period of time, as it tends to collapse into a particular state (0 or 1) when external conditions change.

Google, Intel, IBM, and Microsoft have already tried to build quantum computers, but all of them have far too few qubits to have real commercial value. In addition, China plans to build a $10 billion laboratory to conduct quantum computing experiments.

It's expected that we'll have the first real use cases within the next 10 years or so. Yet the exact date of quantum supremacy, i.e. the moment when a quantum computer can do something that ordinary computers cannot, is still hard to predict.

Voice search is not a brand-new technology in software development, but it is one of the most popular tech trends nowadays. The reason is that it's expected to replace regular searches via touchscreens and keyboards within the next year or two.

The above predictions are based on some quite impressive statistics. For example, in 2017, 1 in 4 shoppers in the US used voice assistants to do their holiday shopping, and it's anticipated that about 50% of all online searches will be voice searches by 2020.

Speaking of voice-activated engines, Amazon Alexa, the virtual assistant developed by Amazon, holds the leading position on the market, but its rivals are not lagging behind. Google, Microsoft and other companies are working to improve the accuracy of speech recognition and expand the range of questions users may ask.

For example, Google recently announced speakable markup, which allows publishers to indicate the parts of articles to be read aloud by Google Assistant, so the latter can respond to a user's query with the most relevant excerpt and then ask the user whether they want to hear the rest of the text.

This app development trend also has good prospects in the long run, since chatbots that can recognize and reply to voice commands will eventually replace customer service agents and even the sellers in shopping centers.

Building digital twins is a software development trend that has a good chance of conquering the manufacturing and engineering industries in the next 3-5 years. In simple words, a digital twin is a virtual model of a physical building, product, system or process. Thanks to numerous sensors and the application of artificial intelligence, digital twins are closely linked to their original objects, which allows companies to track changes in assets and optimize their performance.

Digital twins are widely used in the space industry, as the whole idea was developed to help maintain and repair spacecraft; however, it's expected that about 50% of large manufacturers will use this technology by 2021. Hence, exploiting digital twins is turning from a trend in software development into a must-have tool for any business that wants to stand out from its competitors.

Virtual Reality (VR) and Augmented Reality (AR) have been used by companies for a few years now, but it seems their popularity hasn't reached its peak yet. According to International Data Corporation, in 2021 total spending on VR and AR projects will reach around $160 billion, almost twenty times the total spending on similar products in 2017.

Such rapid growth of these new technologies in software development may be explained by the fact that immersive solutions have already proved their commercial relevance, and not only in the gaming industry. Businesses no longer look at VR and AR as merely interesting things to try; companies are starting to consider them tools to reshape their business activity and take interaction with customers to the next level. Many big brands such as Ikea, Lego, Audi, Toyota and Farber have already followed this trend in software development by building AR/VR apps, so it is unlikely to take other businesses much time to do the same.

New technologies impact the way we live and do business. In this article, we mentioned only some of the latest software development trends, the ones we think will influence our future the most. But the best thing about tech disruptions is that you can not only observe them, you can be a part of them.



Rockport Networks Announces Availability of New Switchless Network and Collaboration with TACC – HPCwire

OTTAWA, Ontario, Oct. 26, 2021. Rockport Networks today announced the commercial availability of its new switchless network architecture that delivers the industry-leading performance and scalability necessary for performance-intensive computing workloads including HPC, AI and ML. Addressing the longstanding data center chokepoints that are throttling innovation, the Rockport Switchless Network offers a completely new design, using breakthrough software and data-routing techniques to overcome congestion, delivering predictable performance improvements of more than 3X that of centralized switch-intensive networks.

The Rockport Switchless Network distributes the network switching function to endpoint devices, where these devices (nodes) become the network. By eliminating layers of switches, the Rockport Switchless Network also significantly frees up rack space to be better used by compute and storage, as well as creating savings in associated power, cooling, and administrative overhead. No longer are compute and storage resources starving for data, and researchers have more predictability regarding workload completion time.

"Rockport was founded based on the fact that switching, and networking in general, is extremely complicated. Over the years, this complexity has forced organizations to make tradeoffs when it comes to performance at scale, so we decided to make it simpler," said Doug Carwardine, CEO and co-founder, Rockport Networks. "We made it our mission to get data from a source to a destination faster than other technologies. Removing the switch was crucial to achieving significant performance advantages in an environmentally and commercially sustainable way."

Rethinking network switches creates an opportunity to leverage direct interconnect topologies that provide a connectivity mesh in which every network endpoint can efficiently forward traffic to every other endpoint. The Rockport Switchless Network is a distributed, highly reliable, high-performance interconnect providing pre-wired supercomputer topologies through a standard plug-and-play Ethernet interface.

The Rockport Network Operating System (rNOS) is the software at the core of the Rockport Switchless Network and runs on the Network Card, fully offloaded from the compute cores and server operating system. The rNOS enables the network to self-discover, self-configure and self-heal. Like a navigation app for data, this patented approach selects and continually optimizes the best path through the network to minimize congestion and latency, while breaking down packets into smaller pieces (FLITs) to ensure high-priority messages are not blocked by large messages or bulk data transfers. Designed to integrate easily into existing and emerging data centers, the Rockport Switchless Network installs in a fraction of the time required to cable traditional switch-based networks. As an embeddable architecture, it will work in any form factor. Today the software is deployed and managed using three main components:

Rockport's technology delivers significant TCO, sustainability and security benefits including:

"When the root of the problem is the architecture, building a better switch just didn't make sense," said Matt Williams, CTO, Rockport Networks. "With sophisticated algorithms and other purpose-built software breakthroughs, we have solved for congestion, so our customers no longer need to just throw bandwidth at their networking issues. We've focused on real-world performance requirements to set a new standard for what the market should expect from the fabrics of the future."

After an extensive beta program for cloud service providers and elite government and academic research labs, the Rockport Switchless Network is being deployed by customers including the University of Texas Advanced Computing Center (TACC). The company is also working with industry organizations including Ohio State University (OSU) to contribute to performance-intensive networking standards.

TACC Establishes New Center of Excellence with Rockport Networks

As part of today's news, Rockport Networks is announcing a collaboration with TACC to create a Center of Excellence in Austin, Texas. TACC houses Frontera, the fastest supercomputer on a university campus and the 10th most powerful supercomputer in the world. TACC has installed 396 nodes on Frontera running production workloads on Rockport's switchless network, including quantum computing and pandemic-related life sciences research, as well as workloads focused on rapid responses to emergencies such as hurricanes, earthquakes, tornadoes, floods, and other large-scale disasters.

"TACC is very pleased to be a Rockport Center of Excellence. We run diverse advanced computing workloads which rely on high-bandwidth, low-latency communication to sustain performance at scale. We're excited to work with innovative new technology like Rockport's switchless network design," stated Dr. Dan Stanzione, director of TACC and associate vice president for research at UT-Austin. "Our team is seeing promising initial results in terms of congestion and latency control. We've been impressed by the simplicity of installation and management. We look forward to continuing to test on new and larger workloads and expanding the Rockport Switchless Network further into our data center."

Industry Validation

"The team at Durham continues to push the bounds when it comes to uncovering next-generation HPC network technologies," said Alastair Basden, DiRAC/Durham University, technical manager of the COSMA HPC Cluster. "Based on a 6D torus, we found the Rockport Switchless Network remarkably easy to set up and install. We looked at codes that rely on point-to-point communications between all nodes with varying packet sizes, where congestion can typically reduce performance on traditional networks. We were able to achieve consistently low latency under load and look forward to seeing the impact this will have on even larger-scale cosmology simulations."

"There's been significant evidence to suggest that switchless architectures have the capacity to significantly uplevel application performance that traditionally has come at great cost," stated Earl C. Joseph, CEO, Hyperion Research. "Making these advances more economically accessible should greatly benefit the global research community and hopefully improve expectations of what the network can deliver when it comes to return-on-research and time-to-results."

"Our mission is to provide the advanced computing community with standard libraries such as MVAPICH2 that support the very best performance available in the market. We make it a top priority to keep our libraries fresh with innovative approaches, like Rockport Networks' new switchless architecture," said DK Panda, professor and distinguished scholar of computer science at the Ohio State University, and lead of the Network-Based Computing Research Group. "We look forward to our ongoing partnership with Rockport to define new standards for our upcoming releases."

Availability and Additional Information

The Rockport Switchless Network is available immediately.

For additional information, including full product details, please visit: https://rockportnetworks.com.

Rockport Networks Launch Activities

Following the launch of the Rockport Switchless Network, the company will participate in a number of industry events including:

Virtual Roundtable with HPCwire, November 10: Rockport joins Dell, TACC, OSU and Hyperion Research for a virtual discussion, "Quieting Noisy Neighbors: Keys to Addressing Congestion in Multi-workload Environments." Register here.

Rockport Networks at SC21, November 14-19: Rockport is a sponsor of the SC21 Virtual Exhibit. Rockport experts will be available to answer questions in the virtual exhibit hall, and Rockport CTO Matthew Williams will present "Advanced congestion control for addressing network performance bottlenecks using a next-generation interconnect," with a live Q&A. Learn more at http://www.rockportnetworks.com/sc21.

About Rockport Networks

Rockport Networks' next generation of high-performance networks unlocks the entire data center to produce more results, faster, and with better economics and environmental sustainability. Modeled after the world's fastest supercomputers, the Rockport Switchless Network replaces centralized switch architectures with a distributed, high-performance direct interconnect that is self-discovering, self-configuring and self-healing, and that is simple and transparent to operate. By virtually eliminating congestion and latency, data center workloads can be completed significantly faster, enabling organizations to improve ROI and make critical decisions more quickly. Learn more at www.rockportnetworks.com.

Source: Rockport Networks


Computers have no feelings and human intuition will trump AI, says Apple Co-founder and tech entrepreneur Stev – YourStory

Beating someone at a game of chess is not real intelligence, says the storied co-founder of Apple, Steve Wozniak. Credited with designing some of the first personal computers and bringing out the Apple I and Apple II, Wozniak is a firm believer in intuition and feelings, which he says Artificial Intelligence (AI) cannot match.

The technology entrepreneur and philanthropist, who left Apple in 1985, continues to pursue new projects. In 2017 he co-founded Woz U, a post-secondary education and training platform for software engineering and technology development, and more recently Efforce, a blockchain-based energy-saving platform.

Speaking excitedly about a new idea of mapping space debris in the near-Earth magnetic field, Wozniak says there are a number of emerging technology solutions he would have liked to work on if he had been born in the 2000s.

He adds that the Internet of Things (especially in a home environment), robotics, building processors, sensors and semiconductors out of different materials, quantum computing and battery technology are some of the other technology challenges that inspire him.

"My whole life I have been thinking about comparing computers to the brain, if there is a way to build an electric brain," says Wozniak.

Wozniak also cheekily reminds us that the human brain was faster than the computer when it started out.

He brings us back to his message: a Google image search can identify a dog breed in seconds, but does it really know what a dog is?



Quantum computing: A simple introduction – Explain that Stuff

by Chris Woodford. Last updated: August 4, 2021.

How can you get more and more out of less and less? The smaller computers get, the more powerful they seem to become: there's more number-crunching ability in a 21st-century cellphone than you'd have found in a room-sized, military computer 50 years ago. Yet, despite such amazing advances, there are still plenty of complex problems that are beyond the reach of even the world's most powerful computers, and there's no guarantee we'll ever be able to tackle them. One problem is that the basic switching and memory units of computers, known as transistors, are now approaching the point where they'll soon be as small as individual atoms. If we want computers that are smaller and more powerful than today's, we'll soon need to do our computing in a radically different way. Entering the realm of atoms opens up powerful new possibilities in the shape of quantum computing, with processors that could work millions of times faster than the ones we use today. Sounds amazing, but the trouble is that quantum computing is hugely more complex than traditional computing and operates in the Alice in Wonderland world of quantum physics, where the "classical," sensible, everyday laws of physics no longer apply. What is quantum computing and how does it work? Let's take a closer look!

Photo: Quantum computing means storing and processing information using individual atoms, ions, electrons, or photons. On the plus side, this opens up the possibility of faster computers, but the drawback is the greater complexity of designing computers that can operate in the weird world of quantum physics.

You probably think of a computer as a neat little gadget that sits on your lap and lets you send emails, shop online, chat to your friends, or play games, but it's much more, and much less, than that. It's more, because it's a completely general-purpose machine: you can make it do virtually anything you like. It's less, because inside it's little more than an extremely basic calculator, following a prearranged set of instructions called a program. Like the Wizard of Oz, the amazing things you see in front of you conceal some pretty mundane stuff under the covers.

Photo: This is what one transistor from a typical radio circuit board looks like. In computers, the transistors are much smaller than this and millions of them are packaged together onto microchips.

Conventional computers have two tricks that they do really well: they can store numbers in memory and they can process stored numbers with simple mathematical operations (like add and subtract). They can do more complex things by stringing together the simple operations into a series called an algorithm (multiplying can be done as a series of additions, for example). Both of a computer's key tricks, storage and processing, are accomplished using switches called transistors, which are like microscopic versions of the switches you have on your wall for turning the lights on and off. A transistor can either be on or off, just as a light can either be lit or unlit. If it's on, we can use a transistor to store a number one (1); if it's off, it stores a number zero (0). Long strings of ones and zeros can be used to store any number, letter, or symbol using a code based on binary (so computers store an upper-case letter A as 01000001 and a lower-case one as 01100001). Each of the zeros or ones is called a binary digit (or bit) and, with a string of eight bits, you can store 256 different characters (such as A-Z, a-z, 0-9, and most common symbols). Computers calculate by using circuits called logic gates, which are made from a number of transistors connected together. Logic gates compare patterns of bits, stored in temporary memories called registers, and then turn them into new patterns of bits, and that's the computer equivalent of what our human brains would call addition, subtraction, or multiplication. In physical terms, the algorithm that performs a particular calculation takes the form of an electronic circuit made from a number of logic gates, with the output from one gate feeding in as the input to the next.
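
The binary codes and gate-built arithmetic described above can be demonstrated in a few lines of Python (an illustrative sketch; the `half_adder` helper is our own name for the classic two-gate circuit, not a standard library function):

```python
# The 8-bit binary codes mentioned above come straight from the ASCII table:
print(format(ord('A'), '08b'))  # 01000001
print(format(ord('a'), '08b'))  # 01100001

# A logic gate simply maps input bits to output bits. An AND gate, for example:
def AND(x, y):
    return x & y

# One-bit addition can be built from two gates: XOR gives the sum bit,
# AND gives the carry bit. This circuit is known as a "half adder".
def half_adder(x, y):
    return x ^ y, x & y  # (sum, carry)

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```

Chaining half adders (with extra gates to propagate the carry) gives the multi-bit adders that real processors use, which is exactly the "algorithm as a circuit of logic gates" idea in the paragraph above.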

The trouble with conventional computers is that they depend on conventional transistors. This might not sound like a problem if you go by the amazing progress made in electronics over the last few decades. When the transistor was invented, back in 1947, the switch it replaced (which was called the vacuum tube) was about as big as one of your thumbs. Now, a state-of-the-art microprocessor (single-chip computer) packs hundreds of millions (and up to 30 billion) transistors onto a chip of silicon the size of your fingernail! Chips like these, which are called integrated circuits, are an incredible feat of miniaturization. Back in the 1960s, Intel co-founder Gordon Moore realized that the power of computers doubles roughly every 18 months, and it's been doing so ever since. This apparently unshakeable trend is known as Moore's Law.

Photo: This memory chip from a typical USB stick contains an integrated circuit that can store 512 megabytes of data. That's roughly 500 million characters (536,870,912 to be exact), each of which needs eight binary digits, so we're talking about 4 billion (4,000 million) transistors in all (4,294,967,296 if you're being picky) packed into an area the size of a postage stamp!

It sounds amazing, and it is, but it misses the point. The more information you need to store, the more binary ones and zeros (and transistors) you need to do it. Since most conventional computers can only do one thing at a time, the more complex the problem you want them to solve, the more steps they'll need to take and the longer they'll need to do it. Some computing problems are so complex that they need more computing power and time than any modern machine could reasonably supply; computer scientists call those intractable problems.
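
A toy illustration of where intractability comes from, using a hypothetical brute-force subset-sum checker (our own example, not from the article): the number of candidate answers doubles with every extra input, so the step count explodes long before the problem description gets large.

```python
from itertools import combinations

# Brute-force subset sum: does any subset of `nums` add up to `target`?
# In the worst case a conventional computer must examine all 2**n subsets,
# so each extra number doubles the work -- the hallmark of problems that
# become intractable as n grows.
def subset_sum(nums, target):
    checked = 0
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            checked += 1
            if sum(combo) == target:
                return True, checked
    return False, checked

found, steps = subset_sum(list(range(1, 11)), 10**9)  # impossible target
print(found, steps)  # False 1024 -- all 2**10 subsets checked for just 10 numbers
```

With 10 numbers that is 1,024 subsets; with 60 numbers it is over a quintillion, which is why "just use a faster computer" stops working.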

As Moore's Law advances, so the number of intractable problems diminishes: computers get more powerful and we can do more with them. The trouble is, transistors are just about as small as we can make them: we're getting to the point where the laws of physics seem likely to put a stop to Moore's Law. Unfortunately, there are still hugely difficult computing problems we can't tackle because even the most powerful computers find them intractable. That's one of the reasons why people are now getting interested in quantum computing.

Things on a very small scale behave like nothing you have any direct experience about... or like anything that you have ever seen.

Richard Feynman

Quantum theory is the branch of physics that deals with the world of atoms and the smaller (subatomic) particles inside them. You might think atoms behave the same way as everything else in the world, in their own tiny little way, but that's not true: on the atomic scale, the rules change and the "classical" laws of physics we take for granted in our everyday world no longer automatically apply. As Richard P. Feynman, one of the greatest physicists of the 20th century, once put it: "Things on a very small scale behave like nothing you have any direct experience about... or like anything that you have ever seen."[1]

If you've studied light, you may already know a bit about quantum theory. You might know that a beam of light sometimes behaves as though it's made up of particles (like a steady stream of cannonballs), and sometimes as though it's waves of energy rippling through space (a bit like waves on the sea). That's called wave-particle duality, and it's one of the ideas that comes to us from quantum theory. It's hard to grasp that something can be two things at once, a particle and a wave, because it's totally alien to our everyday experience: a car is not simultaneously a bicycle and a bus. In quantum theory, however, that's just the kind of crazy thing that can happen. The most striking example of this is the baffling riddle known as Schrödinger's cat. Briefly, in the weird world of quantum theory, we can imagine a situation where something like a cat could be alive and dead at the same time!

What does all this have to do with computers? Suppose we keep on pushing Moore's Law, making transistors smaller and smaller until they get to the point where they obey not the ordinary laws of physics (like old-style transistors) but the more bizarre laws of quantum mechanics. The question is whether computers designed this way can do things our conventional computers can't. If we can predict mathematically that they might be able to, can we actually make them work like that in practice?

People have been asking those questions for several decades. Among the first were IBM research physicists Rolf Landauer and Charles H. Bennett. Landauer opened the door for quantum computing in the 1960s when he proposed that information is a physical entity that could be manipulated according to the laws of physics.[2] One important consequence of this is that computers waste energy manipulating the bits inside them (which is partly why computers use so much energy and get so hot, even though they appear to be doing not very much at all). In the 1970s, building on Landauer's work, Bennett showed how a computer could circumvent this problem by working in a "reversible" way, implying that a quantum computer could carry out massively complex computations without using massive amounts of energy.[3] In 1980, physicist Paul Benioff from Argonne National Laboratory tried to envisage a basic machine that would work in a similar way to an ordinary computer but according to the principles of quantum physics, in other words, a quantum Turing machine.[4] The following year, Richard Feynman sketched out roughly how a machine using quantum principles could carry out basic computations.[5] A few years later, Oxford University's David Deutsch (one of the leading lights in quantum computing) outlined the theoretical basis of a quantum computer in more detail.[6] How did these great scientists imagine that quantum computers might work?

The key features of an ordinary computer (bits, registers, logic gates, algorithms, and so on) have analogous features in a quantum computer. Instead of bits, a quantum computer has quantum bits or qubits, which work in a particularly intriguing way. Where a bit can store either a zero or a one, a qubit can store a zero, a one, both zero and one, or an infinite number of values in between, and it can be in multiple states (store multiple values) at the same time! If that sounds confusing, think back to light being a particle and a wave at the same time, Schrödinger's cat being alive and dead, or a car being a bicycle and a bus. A gentler way to think of the numbers qubits store is through the physics concept of superposition (where two waves add to make a third one that contains both of the originals). If you blow on something like a flute, the pipe fills up with a standing wave: a wave made up of a fundamental frequency (the basic note you're playing) and lots of overtones or harmonics (higher-frequency multiples of the fundamental). The wave inside the pipe contains all these waves simultaneously: they're added together to make a combined wave that includes them all. Qubits use superposition to represent multiple states (multiple numeric values) simultaneously in a similar way.
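
As a rough illustration of why superposition packs in so much information (a plain-Python sketch, not real quantum code): describing n qubits requires 2**n amplitudes, one for every n-bit pattern, so the state a quantum register can hold grows exponentially with its size.

```python
import math

# n qubits in superposition are described by 2**n amplitudes, one per
# classical bit pattern. This exponential growth is why classically
# simulating even ~50 qubits is already out of reach: the state vector
# would need 2**50 entries.
def uniform_superposition(n):
    size = 2 ** n
    amp = 1 / math.sqrt(size)
    return [amp] * size  # equal weight on every n-bit pattern

state = uniform_superposition(3)
print(len(state))                  # 8 amplitudes for 3 qubits
print(sum(a * a for a in state))   # probabilities always sum to ~1.0
```

Three classical bits hold exactly one of eight patterns; the three-qubit state above carries weight on all eight at once.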

Just as a quantum computer can store multiple numbers at once, so it can process them simultaneously. Instead of working in serial (doing a series of things one at a time in a sequence), it can work in parallel (doing multiple things at the same time). Only when you try to find out what state it's actually in at any given moment (by measuring it, in other words) does it "collapse" into one of its possible states, and that gives you the answer to your problem. Estimates suggest a quantum computer's ability to work in parallel would make it millions of times faster than any conventional computer... if only we could build it! So how would we do that?
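
The "collapse" on measurement can be mimicked classically with weighted random sampling. The sketch below (a simulation only; real hardware behaves probabilistically in just this way, but the code is our own illustration) repeatedly "measures" a single qubit in an equal superposition:

```python
import random

# Measuring a superposition collapses it to one basis state, with
# probability equal to the squared amplitude of that state.
def measure(amplitudes):
    probs = [a * a for a in amplitudes]
    return random.choices(range(len(amplitudes)), weights=probs)[0]

plus = [2 ** -0.5, 2 ** -0.5]  # equal superposition of |0> and |1>
counts = {0: 0, 1: 0}
for _ in range(10000):
    counts[measure(plus)] += 1
print(counts)  # roughly 5000 zeros and 5000 ones
```

Note what this implies for programming a quantum computer: you get one random outcome per measurement, so quantum algorithms are designed to make the *useful* answer overwhelmingly likely before you measure.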

In reality, qubits would have to be stored by atoms, ions (atoms with too many or too few electrons), or even smaller things such as electrons and photons (energy packets), so a quantum computer would be almost like a table-top version of the kind of particle physics experiments they do at Fermilab or CERN. Now you wouldn't be racing particles round giant loops and smashing them together, but you would need mechanisms for containing atoms, ions, or subatomic particles, for putting them into certain states (so you can store information), knocking them into other states (so you can make them process information), and figuring out what their states are after particular operations have been performed.

Photo: A single atom or ion can be trapped in an optical cavity (the space between mirrors) and controlled by precise pulses from laser beams.

In practice, there are lots of possible ways of containing atoms and changing their states using laser beams, electromagnetic fields, radio waves, and an assortment of other techniques. One method is to make qubits using quantum dots, which are nanoscopically tiny particles of semiconductors inside which individual charge carriers, electrons and holes (missing electrons), can be controlled. Another method makes qubits from what are called ion traps: you add or take away electrons from an atom to make an ion, hold it steady in a kind of laser spotlight (so it's locked in place like a nanoscopic rabbit dancing in a very bright headlight), and then flip it into different states with laser pulses. In another technique, the qubits are photons inside optical cavities (spaces between extremely tiny mirrors). Don't worry if you don't understand; not many people do. Since the entire field of quantum computing is still largely abstract and theoretical, the only thing we really need to know is that qubits are stored by atoms or other quantum-scale particles that can exist in different states and be switched between them.

Although people often assume that quantum computers must automatically be better than conventional ones, that's by no means certain. So far, just about the only thing we know for certain that a quantum computer could do better than a normal one is factorisation: finding two unknown prime numbers that, when multiplied together, give a third, known number. In 1994, while working at Bell Laboratories, mathematician Peter Shor demonstrated an algorithm that a quantum computer could follow to find the "prime factors" of a large number, which would speed up the problem enormously.[7] Shor's algorithm really excited interest in quantum computing because virtually every modern computer (and every secure, online shopping and banking website) uses public-key encryption technology based on the virtual impossibility of finding prime factors quickly (it is, in other words, essentially an "intractable" computer problem). If quantum computers could indeed factor large numbers quickly, today's online security could be rendered obsolete at a stroke. But what goes around comes around, and some researchers believe quantum technology will lead to much stronger forms of encryption. (In 2017, Chinese researchers demonstrated for the first time how quantum encryption could be used to make a very secure video call from Beijing to Vienna.)
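
To see how factoring connects to something a quantum computer is good at, here is a classical toy version of the arithmetic at the heart of Shor's algorithm. The quantum speedup comes entirely from finding the period r of f(x) = a^x mod N efficiently; in this sketch we simply brute-force it for the textbook case N = 15, then recover the factors from r as Shor's method prescribes.

```python
from math import gcd

# Find the period r of a**x mod N: the smallest r with a**r mod N == 1.
# Classically this takes exponential time for large N; the quantum part
# of Shor's algorithm does it efficiently.
def period(a, N):
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7
r = period(a, N)                 # 7, 4, 13, 1, ... so r = 4
f1 = gcd(a ** (r // 2) - 1, N)   # gcd(48, 15) = 3
f2 = gcd(a ** (r // 2) + 1, N)   # gcd(50, 15) = 5
print(r, f1, f2)  # 4 3 5, and indeed 3 * 5 = 15
```

Once r is known, two gcd computations (cheap even classically) reveal the prime factors; it is only the period-finding step that keeps classical computers locked out of large numbers.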

Does that mean quantum computers are better than conventional ones? Not exactly. Apart from Shor's algorithm, and a search method called Grover's algorithm, hardly any other algorithms have been discovered that would be better performed by quantum methods. Given enough time and computing power, conventional computers should still be able to solve any problem that quantum computers could solve, eventually. In other words, it remains to be proven that quantum computers are generally superior to conventional ones, especially given the difficulties of actually building them. Who knows how conventional computers might advance in the next 50 years, potentially making the idea of quantum computers irrelevant and even absurd.
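Grover's algorithm searches an unsorted collection of N items in roughly the square root of N steps instead of N. A pure-Python simulation (a sketch of the mathematics, not real hardware) of a single Grover iteration over four items, which for N = 4 is enough to find the marked item with certainty:

```python
# Simulate Grover's search over N = 4 items with one marked item.
N = 4
marked = 2                     # the index we're searching for

# Start in a uniform superposition: every item has amplitude 1/sqrt(N).
amps = [1 / N ** 0.5] * N

# Oracle step: flip the sign of the marked item's amplitude.
amps[marked] = -amps[marked]

# Diffusion step ("inversion about the mean"): a -> 2 * mean - a.
mean = sum(amps) / N
amps = [2 * mean - a for a in amps]

probs = [round(a * a, 6) for a in amps]
print(probs)                   # all probability now sits on the marked index
```

A classical search over four items needs, on average, more than two lookups; Grover's method gets there in one iteration here, and the advantage grows as the square root of N for larger searches.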

Photo: Quantum dots are probably best known as colorful nanoscale crystals, but they can also be used as qubits in quantum computers. Photo courtesy of Argonne National Laboratory.

We have decades of experience building ordinary, transistor-based computers with conventional architectures; building quantum machines means reinventing the whole idea of a computer from the bottom up. First, there are the practical difficulties of making qubits, controlling them very precisely, and having enough of them to do really useful things. Next, there's a major difficulty with the errors inherent in a quantum system ("noise," as this is technically called), which seriously compromises any calculations a quantum computer might make. There are ways around this ("quantum error correction"), but they introduce a great deal more complexity. There's also the fundamental issue of how you get data in and out of a quantum computer, which is, itself, a complex computing problem. Some critics believe these issues are insurmountable; others acknowledge the problems but argue the mission is too important to abandon.
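The core idea behind error correction is easiest to see in its simplest classical ancestor, the three-bit repetition code: store each bit three times and repair a single flip by majority vote. (Quantum error correction is far subtler, since qubits cannot simply be copied, but the same redundancy-plus-voting intuition carries over.) A sketch:

```python
import random

def encode(bit):
    """Protect one bit by repeating it three times."""
    return [bit, bit, bit]

def add_noise(codeword, flip_index):
    """Flip one of the three copies, simulating a noisy channel."""
    noisy = list(codeword)
    noisy[flip_index] ^= 1
    return noisy

def decode(codeword):
    """Majority vote: recovers the original bit despite any single flip."""
    return 1 if sum(codeword) >= 2 else 0

bit = 1
noisy = add_noise(encode(bit), random.randrange(3))
print(noisy, "->", decode(noisy))   # decoded value matches the original bit
```

The price is overhead: three physical bits per logical bit here, and in quantum schemes the ratio of physical to logical qubits is far higher, which is exactly the "great deal more complexity" mentioned above.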

Three decades after they were first proposed, quantum computers remain largely theoretical. Even so, there's been some encouraging progress toward realizing a quantum machine. There were two impressive breakthroughs in 2000. First, Isaac Chuang (now an MIT professor, but then working at IBM's Almaden Research Center) used five fluorine atoms to make a crude, five-qubit quantum computer. The same year, researchers at Los Alamos National Laboratory figured out how to make a seven-qubit machine using a drop of liquid. Five years later, researchers at the University of Innsbruck added an extra qubit and produced the first quantum computer that could manipulate a qubyte (eight qubits), later bumping the number up to 14 qubits.

These were tentative but important first steps. Over the next few years, researchers announced more ambitious experiments, adding progressively greater numbers of qubits. By 2011, a pioneering Canadian company called D-Wave Systems announced in Nature that it had produced a 128-qubit machine. [8] The announcement proved highly controversial, and there was a lot of debate over whether the company's machines had really demonstrated quantum behavior. Three years later, Google announced that it was hiring a team of academics (including University of California at Santa Barbara physicist John Martinis) to develop its own quantum computers based on D-Wave's approach. In March 2015, the Google team announced they were "a step closer to quantum computation," having developed a new way for qubits to detect and protect against errors. In 2016, MIT's Isaac Chuang and scientists from the University of Innsbruck unveiled a five-qubit, ion-trap quantum computer that could calculate the factors of 15; one day, a scaled-up version of this machine might evolve into the long-promised, fully fledged encryption buster.

There's no doubt that these are hugely important advances, and the signs are growing steadily more encouraging that quantum technology will eventually deliver a computing revolution. In December 2017, Microsoft unveiled a complete quantum development kit, including a new computer language, Q#, developed specifically for quantum applications. In early 2018, D-Wave announced plans to start rolling out quantum power to a cloud computing platform. A few weeks later, Google announced Bristlecone, a quantum processor based on a 72-qubit array, that might, one day, form the cornerstone of a quantum computer that could tackle real-world problems. In October 2019, Google announced it had reached another milestone: the achievement of "quantum supremacy" (the point at which a quantum computer can beat a conventional machine at a typical computing task), though not everyone was convinced; IBM, for example, disputed the claim.

One thing is beyond dispute: quantum computing is very exciting, and you can find out just how exciting by tinkering with it for yourself. In 2019, Amazon's AWS cloud computing offshoot announced a service called Braket, which gives its users access to quantum computing simulators based on machines being developed by three cutting-edge companies (D-Wave, IonQ, and Rigetti). Microsoft's Azure cloud platform offers a rival service called Azure Quantum, while Google's Quantum AI website offers access to its own research and resources. Take your pick, or try them all.

Despite all this progress, it's early days for the whole field, and most researchers agree that we're unlikely to see practical quantum computers appearing for some years, and more likely several decades. The conclusion reached by an influential National Academies of Sciences, Engineering, and Medicine report in December 2018 was that "it is still too early to be able to predict the time horizon for a practical quantum computer" and that "many technical challenges remain to be resolved before we reach this milestone."

Follow this link:
Quantum computing: A simple introduction - Explain that Stuff

INSIDE QUANTUM TECHNOLOGY New York, The Largest Business Quantum Technology Conference and Exhibition, Announces Focus on Quantum Safe Initiatives and…

NEW YORK, Oct. 19, 2021 /PRNewswire/ -- 3DR Holdings today announced a deep dive into Quantum Safe initiatives and use cases as a prime focus of Inside Quantum Technology, the industry's leading conference and exhibition. Sponsored by IBM, Inside Quantum Technology will run from November 1-5 as a hybrid virtual and in-person event with live sessions in New York City. The conference is dedicated to the business of quantum computing and will feature presentations and discussions critical to those seeking new business revenues from quantum-related opportunities.

Continued developments in quantum computing represent a serious threat to existing encryption systems that protect critical networks and applications. It's against this backdrop that Inside Quantum Technology will focus on technologies being developed to protect these systems, along with an examination of real-world end use cases.

In addition to its world-class conference program, Inside Quantum Technology will provide attendees with opportunities to visit leading vendors in its exhibit hall, both in-person and virtually, where visitors can download materials, watch videos, and connect with company representatives. The event also offers networking opportunities on each day, enabling participants to gather and engage based on specific quantum-related topics.

For additional details about Inside Quantum Technology, including the complete agenda, registration information, sponsorship and exhibition options, please visit https://iqtevent.com/fall/.

About 3DR Holdings
3DR Holdings is a technology media organization with website, research and international trade show interests in the fields of Quantum Technology and 3D Printing. For more information, please visit https://3drholdings.com.

About Inside Quantum Technology
Inside Quantum Technology is the only organization worldwide dedicated to meeting the strategic information and analysis needs of the emerging quantum technology sector via events, daily news, research and podcasts. For additional information, please visit https://www.insidequantumtechnology.com.

Media Contact: Barry Schwartz, Schwartz Public Relations, [emailprotected], 212-677-8700 ext. 118

SOURCE Inside Quantum Technology

Continue reading here:
INSIDE QUANTUM TECHNOLOGY New York, The Largest Business Quantum Technology Conference and Exhibition, Announces Focus on Quantum Safe Initiatives and...