Archive for the ‘Quantum Computing’ Category

Google and IBM square off in Schrödinger's catfight over quantum supremacy – The Register

Column: Just before Christmas, Google claimed quantum supremacy. The company had configured a quantum computer to produce results that would take conventional computers some 10,000 years to replicate - a landmark event.

Bollocks, said IBM - which also has big investments both in quantum computing and in not letting Google get away with stuff. Using Summit, the world's most powerful conventional supercomputer at Oak Ridge National Laboratory in Tennessee, IBM claimed it could do the same calculation in a smidge over two days.

As befits all things quantum, the truth is a bit of both. IBM's claim is fair enough - but it's right at the edge of Summit's capability and frankly a massive waste of its time. Google could, if it wished, tweak the quantum calculation to move it out of that range. And it might: the calculation was chosen precisely not because it was easy, but because it was hard. Harder is better.

Google's quantum CPU has 54 qubits, quantum bits that can stay in a state of being simultaneously one and zero. The active device itself is remarkably tiny, a silicon chip around a centimetre square, or four times the size of the Z80 die in your childhood ZX Spectrum. On top of the silicon, a nest of aluminium tickled by microwaves hosts the actual qubits. The aluminium becomes superconducting below around 1 K, but the very coldest part of the circuit is held at just 15 millikelvin. At this temperature the qubits have low enough noise to survive long enough to be useful.

By configuring the qubits in a circuit, setting up data and analysing the patterns that emerge when the superpositions are observed and thus collapse to either one or zero, Google can determine the probable correct outcome for the problem the circuit represents. 54 qubits, if represented in conventional computer terms, would need 2^54 bits of RAM to represent each step of the calculation, or two petabytes' worth. Manipulating this much data many times over gives the 10 millennia figure Google claims.
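As a quick back-of-the-envelope check (my own sketch, not from The Register piece), the two-petabyte figure falls straight out of the state-vector size:

```python
# Back-of-the-envelope check of the storage claim: a register of 54 qubits
# has 2**54 basis states, so even at one bit per amplitude a single step of
# the simulation already needs petabytes of memory.
n_qubits = 54
state_bits = 2 ** n_qubits            # number of basis-state amplitudes
state_bytes = state_bits / 8          # treating each amplitude as a single bit
petabytes = state_bytes / 2 ** 50     # 1 PiB = 2**50 bytes
print(f"{state_bits:.3e} bits  ~=  {petabytes:.1f} PiB")   # roughly 2 PiB
```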

IBM, on the other hand, says that it has just enough disk space on Summit to store the complete calculation. However you do it, though, it's not very useful; the only application is in random number generation. That's a fun, important and curiously nuanced field, but you don't really need a refrigerator stuffed full of qubits to get there. You certainly don't need the 27,648 Nvidia Tesla GPUs in Summit chewing through 16 megawatts of power.

What Google is actually doing is known in the trade as "pulling a Steve", from the marketing antics of the late Steve Jobs - in particular, his tour at NeXT Inc, the company he started in the late 1980s to annoy Apple and produce idiosyncratic workstations. Hugely expensive to make and even more so to buy, the NeXT systems were never in danger of achieving dominance - but you wouldn't know that from Jobs' pronouncements. He declared market supremacy at every opportunity, although in carefully crafted phrases that critics joked defined the market as "black cubic workstations running NeXTOS."

Much the same is true of Google's claim. The calculation is carefully crafted to do precisely the things that Google's quantum computer can do - the important thing isn't the result, but the journey. Perhaps the best analogy is with the Wright Brothers' first flight: of no practical use, but tremendous significance.

What happened to NeXT? It got out of hardware and concentrated on software, then Jobs sold it - and himself - to Apple, which folded some of that software into MacOS development. Oh, and some cat called Berners-Lee built something called the World Wide Web on a NeXT Cube.

Nothing like this will happen with Google's technology. There's no new web waiting to be borne on the wings of supercooled qubits. Even the more plausible applications, like quantum decryption of internet traffic, are a very long way from reality - and, once it happens, it's going to be relatively trivial to tweak conventional encryption to defeat it. But the raw demonstration, that a frozen lunchbox consuming virtually no power in its core can outperform a computer chewing through enough wattage to keep a small town going, is a powerful inducement for more work.

That's Google's big achievement. So many new and promising technologies have failed not because they could never live up to expectations but because they can't survive infancy. Existing, established technology has all the advantages: it generates money, it has distribution channels, it has an army of experts behind it, and it can adjust to close down challengers before they get going. To take just one company: Intel has tried for decades to break out of the x86 CPU prison. New wireless standards, new memory technologies, new chip architectures, new display systems, new storage and security ideas - year after year, the company casts about for something new that'll make money. It never gets there.

Google's "quantum supremacy" isn't there either, but it has done enough to protect its infant prince in its superconducting crib. That's worth a bit of hype.

Continued here:
Google and IBM square off in Schrödinger's catfight over quantum supremacy - The Register

Charles Hoskinson Predicts Economic Collapse, Rise of Quantum Computing, Space Travel and Cryptocurrency in the 2020s – The Daily Hodl

The new decade will unleash a series of seismic shifts, predicts Charles Hoskinson, creator of Cardano and a co-founder of Ethereum. And these changes will propel cryptocurrency and blockchain solutions to the forefront as legacy systems buckle, transform or dissolve.

In an ask-me-anything session uploaded on January 3rd, the 11th birthday of Bitcoin, Hoskinson acknowledges how the popular cryptocurrency gave him an eye-opening introduction to the world of global finance, and he recounts how dramatically official attitudes and perceptions have changed.

"Every central bank in the world is aware of cryptocurrencies and some are even taking positions in cryptocurrencies. There's really never been a time in human history where one piece of technology has obtained such enormous global relevance without any central coordinated effort, any central coordinated marketing. No company controls it and the revolution is just getting started."

And he expects its emergence to coalesce with other epic changes. In a big picture reveal, Hoskinson plots some of the major events he believes will shape the new decade.

2020 Predictions

Hoskinson says the consequences of these technologies will reach every government service and that cryptocurrencies will gain an opening once another economic collapse similar to 2008 shakes the markets this decade.

"I think that means it's a great opening for cryptocurrencies to be ready to start taking over the global economy."

Hoskinson adds that he's happy to be alive to witness all of the changes he anticipates, including a reorganization of the media.

"This is the last decade of traditional organized media, in my view. We're probably going to have less CNNs and Fox Newses and Bloombergs and Wall Street Journals and more Joe Rogans, especially as we enter the 2025s and beyond. And I think our space in particular is going to fundamentally change the incentives of journalism. And we'll actually move to a different way of paying for content, curating content."



Read more:
Charles Hoskinson Predicts Economic Collapse, Rise of Quantum Computing, Space Travel and Cryptocurrency in the 2020s - The Daily Hodl

World High Performance Computing (HPC) Markets to 2025 – AI, IoT, and 5G will be Major Drivers for HPC Growth as they Facilitate the Need to Process…

DUBLIN, Jan. 9, 2020 /PRNewswire/ -- The "High Performance Computing (HPC) Market by Component, Infrastructure, Services, Price Band, HPC Applications, Deployment Types, Industry Verticals, and Regions 2020-2025" report has been added to ResearchAndMarkets.com's offering.

This report evaluates the HPC market, including companies, solutions, use cases, and applications. Analysis covers HPC by organizational size, software and system type, server type, price band, and industry vertical. The report also assesses the market for integration of various artificial intelligence technologies in HPC, and evaluates the exascale-level HPC market by component, hardware type, service type, and industry vertical.

High Performance Computing (HPC) may be provided via a supercomputer or via parallel processing techniques such as leveraging clusters of computers to aggregate computing power. HPC is well-suited for applications that require high performance data computation such as certain financial services, simulations, and various R&D initiatives.
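As a toy illustration of that aggregation idea (my own sketch, not part of the report), the same split-compute-combine pattern can be shown on a single machine with Python's standard multiprocessing module; real clusters do the equivalent across many nodes, typically with MPI or a batch scheduler:

```python
# Minimal illustration of "aggregating computing power": split a job into
# independent chunks, farm them out to worker processes, and combine the
# partial results.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    chunks = [(i, i + 250_000) for i in range(0, 1_000_000, 250_000)]
    with Pool(processes=4) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # sum of squares below 1,000,000, computed in parallel
```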

The demand side of the market is currently dominated by large corporations, universities, and government institutions, whose capabilities are typically used to solve very specific problems for those large institutions. Examples include financial services organizations, government R&D facilities, university research groups, etc.

However, the cloud-computing based as a Service model allows HPC market offerings to be extended via HPC-as-a-Service (HPCaaS) to a much wider range of industry verticals and companies, thereby providing computational services to solve a much broader array of problems. Industry use cases are increasingly emerging that benefit from HPC-level computing, many of which benefit from split processing between localized device/platform and HPCaaS.

In fact, HPCaaS is poised to become much more commonly available, partly due to new on-demand supercomputer service offerings and partly as a result of emerging AI-based tools for engineers. Accordingly, up to 45% of revenue will be directly attributable to the cloud-based business model via HPCaaS.

In a recent study, we conducted interviews with major players in the market as well as smaller, lesser known companies that are believed to be influential in terms of innovative solutions that are likely to drive adoption and usage of both cluster-based HPC and supercomputing.

In an effort to identify growth opportunities for the HPC market, we investigated market gaps, including unserved and underserved markets and submarkets. We uncovered a market situation in which HPC currently suffers from an accessibility problem as well as inefficiencies and supercomputer skill gaps.

Stated differently, the market for HPC as a Service (i.e., access to high-performance computing services) currently suffers from problems related to utilization, scheduling, and the set-up time to run jobs on a supercomputer. We identified start-ups and small companies working to solve these problems.

One of the challenge areas identified is low utilization combined (ironically) with high wait times for most supercomputers. Scheduling is difficult because workload run times are hard to estimate: about 20% of jobs are computationally heavy, and roughly 30% of jobs cannot be estimated to better than a three-minute window. In many instances, users request substantial resources and then don't actually use the computing time.

In addition to the scheduling challenge, we identified a company focused on solving further problems such as computational planning and engineering. We spoke with the principal of a little-known company called Microsurgeonbot, Inc. (doing business as MSB.ai), which is developing a tool for setting up computing jobs for supercomputers.

The company is working to solve major obstacles in accessibility and usability for HPC resources. It focuses on a very important problem in HPC: supercomputer job set-up and the skills gap. Its solution, known as "Guru," is poised to make supercomputing much more accessible, especially to engineers in small to medium-sized businesses that do not have the same resources or expertise as large corporate entities.

Key Topics Covered

1 Executive Summary
1.1 Companies in Report
1.2 Target Audience
1.3 Methodology

2 Introduction
2.1 Next Generation Computing
2.2 High Performance Computing
2.2.1 HPC Technology
2.2.1.1 Supercomputers
2.2.1.2 Computer Clustering
2.2.2 Exascale Computation
2.2.2.1 United States
2.2.2.2 China
2.2.2.3 Europe
2.2.2.4 Japan
2.2.2.5 India
2.2.2.6 Taiwan
2.2.3 High Performance Technical Computing
2.2.4 Market Segmentation Considerations
2.2.4.1 Government, NGOs, and Universities
2.2.4.2 Small Companies and Middle Market
2.2.5 Use Cases and Application Areas
2.2.5.1 Computer Aided Engineering
2.2.5.2 Government
2.2.5.3 Financial Services
2.2.5.4 Education and Research
2.2.5.5 Manufacturing
2.2.5.6 Media and Entertainment
2.2.5.7 Electronic Design Automation
2.2.5.8 Bio-Sciences and Healthcare
2.2.5.9 Energy Management and Utilities
2.2.5.10 Earth Science
2.2.6 Regulatory Framework
2.2.7 Value Chain Analysis
2.2.8 AI to Drive HPC Performance and Adoption

3 High Performance Computing Market Analysis and Forecast 2020-2025
3.1 Global High Performance Computing Market 2020-2025
3.1.1 Total High Performance Computing Market 2020-2025
3.1.2 High Performance Computing Market by Component 2020-2025
3.1.2.1 High Performance Computing Market by Hardware and Infrastructure Type 2020-2025
3.1.2.1.1 High Performance Computing Market by Server Type 2020-2025
3.1.2.2 High Performance Computing Market by Software and System Type 2020-2025
3.1.2.3 High Performance Computing Market by Professional Service Type 2020-2025
3.1.3 High Performance Computing Market by Deployment Type 2020-2025
3.1.4 High Performance Computing Market by Organization Size 2020-2025
3.1.5 High Performance Computing Market by Server Price Band 2020-2025
3.1.6 High Performance Computing Market by Application Type 2020-2025
3.1.6.1 High Performance Technical Computing Market by Industry Vertical 2020-2025
3.1.6.2 Critical High Performance Business Computing Market by Industry Vertical 2020-2025
3.1.1 High Performance Computing Deployment Options: Supercomputer vs. Clustering 2020-2025
3.1.2 High Performance Computing as a Service (HPCaaS) 2020-2025
3.1.3 AI Powered High Performance Computing Market
3.1.3.1 AI Powered High Performance Computing Market by Component
3.1.3.2 AI Powered High Performance Computing Market by AI Technology
3.2 Regional High Performance Computing Market 2020-2025
3.3 Exascale Computing Market 2020-2025
3.3.1 Exascale Computing Driven HPC Market by Component 2020-2025
3.3.2 Exascale Computing Driven HPC Market by Hardware Type 2020-2025
3.3.3 Exascale Computing Driven HPC Market by Service Type 2020-2025
3.3.4 Exascale Computing Driven HPC Market by Industry Vertical 2020-2025
3.3.1 Exascale Computing as a Service 2020-2025

4 High Performance Computing Company Analysis
4.1 HPC Vendor Ecosystem
4.2 Leading HPC Companies
4.2.1 Amazon Web Services Inc.
4.2.2 Atos SE
4.2.3 Advanced Micro Devices Inc.
4.2.4 Cisco Systems
4.2.5 DELL Technologies Inc.
4.2.6 Fujitsu Ltd.
4.2.7 Hewlett Packard Enterprise (HPE)
4.2.8 IBM Corporation
4.2.9 Intel Corporation
4.2.10 Microsoft Corporation
4.2.11 NEC Corporation
4.2.12 NVIDIA
4.2.13 Rackspace Inc.
4.1 Companies to Watch
4.1.1 Braket Inc.
4.1.1 MicroSurgeonBot Inc. (MSB.ai)

5 Conclusions and Recommendations
5.1 AI to Support Adoption and Usage of HPC
5.2 5G and 6G to Drive Increased Demand for HPC

6 Appendix: Future of Computing
6.1 Quantum Computing
6.1.1 Quantum Computing Technology
6.1.2 Quantum Computing Considerations
6.1.3 Market Challenges and Opportunities
6.1.4 Recent Developments
6.1.5 Quantum Computing Value Chain
6.1.6 Quantum Computing Applications
6.1.7 Competitive Landscape
6.1.8 Government Investment in Quantum Computing
6.1.9 Quantum Computing Stakeholders by Country
6.1 Other Future Computing Technologies
6.1.1 Swarm Computing
6.1.2 Neuromorphic Computing
6.1.3 Biocomputing
6.2 Market Drivers for Future Computing Technologies
6.2.1 Efficient Computation and High Speed Storage
6.2.2 Government and Private Initiatives
6.2.3 Flexible Computing
6.2.4 AI-enabled, High Performance Embedded Devices, Chipsets, and ICs
6.2.5 Cost Effective Computing powered by Pay-as-you-go Model
6.3 Future Computing Market Challenges
6.3.1 Data Security Concerns in Virtualized and Distributed Cloud
6.3.2 Funding Constrains R&D Activities
6.3.3 Lack of Skilled Professionals across the Sector
6.3.4 Absence of Uniformity among NGC Branches including Data Format

For more information about this report visit https://www.researchandmarkets.com/r/xa4mit


Excerpt from:
World High Performance Computing (HPC) Markets to 2025 - AI, IoT, and 5G will be Major Drivers for HPC Growth as they Facilitate the Need to Process...

Quantum Computers Finally Beat Supercomputers in 2019 – Discover Magazine

In his 2013 book, Schrödinger's Killer App, Louisiana State University theoretical physicist Jonathan Dowling predicted what he called "super exponential growth." He was right. Back in May, during Google's Quantum Spring Symposium, computer engineer Hartmut Neven reported the company's quantum computing chip had been gaining power at breakneck speed.

The subtext: We are venturing into an age of quantum supremacy, the point at which quantum computers outperform the best classical supercomputers in solving a well-defined problem.

Engineers test the accuracy of quantum computing chips by using them to solve a problem, and then verifying the work with a classical machine. But in early 2019, that process became problematic, reported Neven, who runs Google's Quantum Artificial Intelligence Lab. Google's quantum chip was improving so quickly that his group had to commandeer increasingly large computers and then clusters of computers to check its work. It's become clear that eventually, they'll run out of machines.

Case in point: Google announced in October that its 53-qubit quantum processor had needed only 200 seconds to complete a problem that would have required 10,000 years on a supercomputer.

Neven's group observed a double exponential growth rate in the chip's computing power over a few months. Plain old exponential growth is already really fast: It means that from one step to the next, the value of something multiplies. Bacterial growth can be exponential if the number of organisms doubles during an observed time interval. So can computing power of classical computers under Moore's Law, the idea that it doubles roughly every year or two. But under double exponential growth, the exponents have exponents. That makes a world of difference: Instead of a progression from 2 to 4 to 8 to 16 to 32 bacteria, for example, a double-exponentially growing colony in the same time would grow from 2 to 4 to 16 to 256 to 65,536.
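A few lines of Python (mine, not Discover's) make the difference concrete:

```python
# Exponential growth vs. double exponential growth over the same five steps.
steps = range(5)
exponential        = [2 ** (n + 1) for n in steps]     # 2, 4, 8, 16, 32
double_exponential = [2 ** (2 ** n) for n in steps]    # 2, 4, 16, 256, 65536
print(exponential)
print(double_exponential)
```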

Neven credits the growth rate to two factors: the predicted way that quantum computers improve on the computational power of classical ones, and quick improvement of quantum chips themselves. Some began referring to this growth rate as Neven's Law. Some theorists say such growth was unavoidable.

We talked to Dowling (who suggests a more fitting moniker: the Dowling-Neven Law) about double exponential growth, his prediction and his underappreciated Beer Theory of Quantum Mechanics.

Q: You saw double exponential growth on the horizon long before it showed up in a lab. How?

A: Anytime there's a new technology, if it is worthwhile, eventually it kicks into exponential growth in something. We see this with the internet, we saw this with classical computers. You eventually hit a point where all of the engineers figure out how to make this work, miniaturize it and then you suddenly run into exponential growth in terms of the hardware. If it doesn't happen, that hardware falls off the face of the Earth as a nonviable technology.

Q: So you weren't surprised to see Google's chip improving so quickly?

A: I'm only surprised that it happened earlier than I expected. In my book, I said within the next 50 to 80 years. I guessed a little too conservatively.

Q: You're a theoretical physicist. Are you typically conservative in your predictions?

A: People say I'm fracking nuts when I publish this stuff. I like to think that I'm the crazy guy that always makes the least conservative prediction. I thought this was far-out wacky stuff, and I was making the most outrageous prediction. That's why it's taking everybody by surprise. Nobody expected double exponential growth in processing power to happen this soon.

Q: Given that quantum chips are getting so fast, can I buy my own quantum computer now?

A: Most people think the quantum computer is a solved problem. That we can just wait, and Google will sell you one that can do whatever you want. But no. We're in the [prototype] era. The number of qubits is doubling every six months, but the qubits are not perfect. They fail a lot and have imperfections and so forth. But Intel and Google and IBM aren't going to wait for perfect qubits. The people who made the [first computers] didn't say, "We're going to stop making bigger computers until we figure out how to make perfect vacuum tubes."

Q: What's the big deal about doing problems with quantum mechanics instead of classical physics?

A: If you have 32 qubits, it's like you have 2^32 parallel universes that are working on parts of your computation. Or like you have a parallel processor with 2^32 processors. But you only pay the electric bill in our universe.

Q: Quantum mechanics gets really difficult, really fast. How do you deal with that?

A: Everybody has their own interpretation of quantum mechanics. Mine is the Many Beers Interpretation of Quantum Mechanics. With no beer, quantum mechanics doesn't make any sense. After one, two or three beers, it makes perfect sense. But once you get to six or 10, it doesn't make any sense again. I'm on my first bottle, so I'm in the zone.

[This story originally appeared in print as "The Rules of the Road to Quantum Supremacy."]

See original here:

Quantum Computers Finally Beat Supercomputers in 2019 - Discover Magazine

Quantum computing: Solving problems beyond the power of classical computing – Economic Times

Weather forecasting today is good. Can it get better? Sure, it can, if computers can be better. This is where quantum computers come into the picture. They possess computing capacity beyond anything that today's classical computers can ever achieve, because quantum computers can run certain calculations exponentially faster than today's conventional binary computers. That makes them powerful enough to bridge gaps that exist in today's weather forecasting, drug discovery, financial modelling and many other complex areas.

Classical computing has been the backbone of modern society. It gave us satellite TV, the internet and digital commerce. It put robots on Mars and smartphones in our pockets.

"But many of the world's biggest mysteries and potentially greatest opportunities remain beyond the grasp of classical computers," says Stefan Filipp, quantum scientist at IBM Research. "To continue the pace of progress, we need to augment the classical approach with a new platform, one that follows its own set of rules. That is quantum computing."

Classical computing is based on the binary system, where the fundamental carriers of information, bits, can take on a value of either 0 or 1.

All information is stored and read as a sequence of 0s and 1s. A state of 0 is off (or false) and a state of 1 is on (or true). Unlike bits, quantum bits, or qubits, can exist in superpositions of 0 and 1 rather than holding a single definite value, enabling them to encode information in fundamentally richer ways.

Superposition and entanglement are two fundamental properties of quantum objects. The ability to manipulate these properties is what makes quantum algorithms fundamentally different from classical algorithms.
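To make those two properties slightly less abstract, here is a minimal sketch using Qiskit, an open-source toolkit the article itself does not mention, so treat the choice as illustrative: a Hadamard gate puts one qubit into superposition, and a controlled-NOT entangles it with a second.

```python
# Minimal Bell-state circuit: qubit 0 is put into superposition (H), then
# entangled with qubit 1 (CNOT). Measuring both yields only '00' or '11',
# each about half the time, which no pair of independent classical bits can do.
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)
qc.h(0)                      # superposition on qubit 0
qc.cx(0, 1)                  # entangle qubit 0 with qubit 1
qc.measure([0, 1], [0, 1])
print(qc.draw())             # text diagram of the circuit
```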

Quantum computers working with classical systems have the potential to solve complex real-world problems such as simulating chemistry, modelling financial risk and optimising supply chains.

For example, Exxon Mobil plans to use quantum computing to better understand catalytic and molecular interactions that are too difficult to calculate with classical computers. Potential applications include more predictive environmental models and highly accurate quantum chemistry calculations to enable the discovery of new materials for more efficient carbon capture.

JP Morgan Chase is focusing on use cases for quantum computing in the financial industry, including trading strategies, portfolio optimisation, asset pricing and risk analysis.

In India, the government has launched two initiatives in the emerging field a networked programme on Quantum Information Science and Technology (QuST) and the National Mission on Quantum Technologies & Applications (NMQTA).

Despite all the progress, practical, working quantum systems might take most of the 2020s to arrive. And you won't see or need a quantum machine on your desk. These will be used by governments and large enterprises, unless you want to find aliens or figure out and execute ways to boil the ocean while sitting at home.

This story is part of the 'Tech that can change your life in the next decade' package

View original post here:

Quantum computing : Solving problems beyond the power of classical computing - Economic Times