Archive for the ‘Quantum Computer’ Category

Malaysia: Leveraging On Digitalisation Trends – The ASEAN Post

Analysts and pundits didn't foresee COVID-19 coming in 2020, nor that the virus would accelerate the digitalisation trend, a seismic or tectonic shift in its own right, resulting from the fragmentation of physical processes and the emphasis on a low-touch economy as part of compliance with the standard operating procedures (SOP) to break and contain the transmission of the virus.

Not all digitalisation trends were precipitated (in the sense of having their momentum accelerated) by the unprecedented spread of COVID-19, though; some had been in the works for years, and the breakthroughs only came this year. Digitalisation trends for 2021 will likewise reflect similar developments. That is, COVID-19 will have been the impetus and catalyst, in contradistinction to the cause, for the rise of some digitalisation trends, whilst others were already being pursued beforehand.

Let's take a look at some of the digital lessons from 2020 as well as look ahead to 2021.

Cloud Kitchens

COVID-19 has encouraged and enhanced the use of cloud services for physical operations, such as in cloud kitchens. What this means is that cooking and delivery services can be centralised rather than relying on disparate collection points such as various restaurants. The underlying purpose is that dining-in (front-of-house) areas are removed from the overall business process, thus saving on costs: labour/manpower, operations, overheads, dining assets and so on.

In Malaysia in particular and the region in general, online food delivery businesses such as GrabFood (through Grab e-Kitchen) and FoodPanda have been leveraging on the cloud kitchen concept due to high demand and cost effectiveness. The cloud kitchen trend which came to the fore in 2020 is expected to grow and expand in the Klang Valley in tandem with the overall growth and explosion of e-commerce in the country.

There's also the trend of hyperconverged infrastructure/technology (HCI), whereby businesses and enterprises can save costs and physical space too. Data management and cloud specialist Nutanix defines HCI as the combination of common datacentre hardware using locally attached storage resources with intelligent software to create flexible building blocks that replace legacy infrastructure consisting of separate servers, storage networks, and storage arrays.

International Data Corporation (IDC), a leading information and communications technology (ICT) market intelligence firm, has predicted that the HCI market will grow to US$7.64 billion in 2021. In Malaysia, local logistics and express carrier giant Gdex has adopted Nutanix Hybrid Cloud to keep up with demands in e-commerce for scalability and business-to-consumer (B2C) operations.

Augmented Reality / Virtual Reality

And then we have augmented reality (AR)/virtual reality (VR), which is making its presence felt in Malaysia's tourism sector. Again, COVID-19 has resulted in partial lockdowns, or the movement control order (MCO) in Malaysia's case, which has massively impacted the tourism sector, the country's third major export and foreign exchange earner.

AR/VR is a digital gateway and portal to the on-site tourism experience. Used for marketing and promotional purposes, it allows potential on-site tourists to enjoy an audio-visual sampling of the full package on offer, that is, the real-world, tactual experience. All one needs to access the virtual experience is a smartphone, laptop, tablet or personal computer (PC).

Moving forward, the Artificial Intelligence of Things (AIoT), which is basically the combination of artificial intelligence (AI) and the Internet of Things (IoT), is making rapid headway. According to futurist Bernard Marr, IoT devices such as sensors, universal remote controllers, and biometric scanners can be likened to a digital nervous system, with AI serving as the brain.

"When AI is added to the IoT, it means that those devices can analyse data and make decisions and act on that data without involvement by humans," explains Marr.
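To make Marr's "digital nervous system plus brain" image concrete, here is a minimal sketch of the AIoT pattern: a sensor reading is scored by a small on-device model, which then acts without human involvement. The sensor values, threshold and function names are invented for illustration and do not refer to any particular product.

```python
# Hypothetical AIoT loop: an IoT sensor feeds readings to a tiny on-device
# "model", which decides and acts without a human in the loop.
from statistics import mean, stdev

def is_anomaly(history, reading, threshold=3.0):
    """Flag a reading more than `threshold` standard deviations from the
    recent mean (a stand-in for a trained AI model)."""
    if len(history) < 10:
        return False
    mu, sigma = mean(history), stdev(history)
    return sigma > 0 and abs(reading - mu) > threshold * sigma

history = [21.0, 21.2, 20.9, 21.1, 21.0, 21.3, 20.8, 21.1, 21.0, 21.2]
reading = 35.4  # e.g. a sudden temperature spike from a smart-home sensor

if is_anomaly(history, reading):
    print("actuate: shut valve / raise alert")  # the "act on that data" step
```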

With the advent of 5G technology and smart cities, AIoT is expected to emerge in the near future as part of the new norm in our homes.

Protein Folding

While not exactly a digitalisation trend, the online journal Nature reported on 30 November that after years of painstaking effort, an AI called AlphaFold, developed by Google offshoot DeepMind, has achieved a gargantuan leap in computational biology, namely determining a protein's 3D shape from its amino-acid sequence, or what is popularly known as protein folding, where structure is function (an axiom of molecular biology).

As proteins are the building blocks of life, unravelling their molecular structure would yield insights into the mysteries of life, so that finding treatments and cures for intractable diseases such as Parkinson's, producing antiviral drugs for COVID-19 or identifying suitable enzymes that biodegrade industrial waste would be possible.

According to the DeepMind website, AlphaFold was trained (via deep learning) on the sequences and structures of around 100,000 known proteins. Come 2021, we can expect the beginning of a new chapter in many scientific and industrial applications, hopefully extending to agriculture and food production, air pollution control (carbon capture and storage) and water treatment, among others.

Connected to the AI breakthrough in predicting protein folding is, of course, quantum computing, which represents the leap from bits (binary 0 or 1) to qubits (0 and 1 at the same time) based on quantum physics and mechanics (the simultaneity and duality of superposition and entanglement). For now, quantum computing can be deployed for complex tasks such as predicting the 3D shape and structure of folded proteins.

Blockchain

As for blockchain or distributed ledger technology (DLT), it is fast making a mark in supply chain management (SCM) with the strategic collaboration between public and private sectors. In Malaysia, the use of blockchain by the Royal Malaysian Customs Department (RMCD) will ease and facilitate import-export transactions of private sector stakeholders (shipping/logistics and traders).

Specifically, the TradeLens platform jointly developed by AP Moller-Maersk and IBM is based on the Collaboration Application Programming Interface (API) concept, which ensures that all logistics activities such as haulage, warehousing, shipping and freight forwarding, at both domestic and international levels, can now be wholly integrated.

Notwithstanding, will the quantum supremacy that Google has claimed to achieve finally constrain the full potential of blockchain technology? According to Deloitte, someone with an operational quantum computer and access to the public key (public address) could falsify the transaction signature. That signature relies on hashing, an encryption mechanism (in the form of a cryptographic function) that serves as proof of work and links one block of transaction data to the next (hence forming a blockchain); by breaking it, an attacker could work back to the private key (i.e., decrypt the signature) and gain entry. Be that as it may, quantum computing could also be deployed within blockchain technology to fend off would-be hackers or rogue miners.
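As a rough illustration of the hashing-as-linkage idea described above (a toy sketch, not the RMCD or TradeLens systems; the transactions and field names are invented), each block stores the hash of its predecessor, so altering an earlier block invalidates every later link:

```python
# Toy illustration of how each block's hash links to the previous block,
# so tampering with one block breaks every later link in the chain.
import hashlib, json

def block_hash(block):
    # Deterministic serialization, then SHA-256 (the kind of cryptographic
    # hash function the article refers to).
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

genesis = {"index": 0, "prev_hash": "0" * 64, "tx": ["alice->bob:5"]}
block1  = {"index": 1, "prev_hash": block_hash(genesis), "tx": ["bob->carol:2"]}

# Verification: recompute the hash of the previous block and compare.
assert block1["prev_hash"] == block_hash(genesis)

genesis["tx"] = ["alice->bob:500"]                  # a forged transaction...
print(block1["prev_hash"] == block_hash(genesis))   # ...now prints False
```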

Autonomous Driving

And not least, robotic process automation (RPA) is increasingly being used in fintech (financial technology). In its Fintech and Digital Banking 2025 Asia Pacific report, IDC stated that financial liberalisation, the drive towards cost reduction, intense competition from counterparts as well as P2P (peer-to-peer) players, wafer-thin net interest margins and more are catalysing banks to further automate, e.g. through RPA software that enables computers to process the manual workloads of business processes more efficiently and effectively (such as triggering error-free responses).

Finally, autonomous driving will soon be an in-thing in Malaysia as it is in other parts of the world, not least across the Causeway (in Singapore). Software by eMooVit Technology, a local start-up specialising in vehicle-agnostic driverless software for urban environment routes, can be used in different applications such as first/last-mile transportation, logistics and utility solutions.

On 23 December last year, eMoovit was reported to be the first company to use Malaysias first self-driving vehicle testing route as announced by Futurise, a wholly-owned subsidiary of technology hub enabler, Cyberview. As reported in the local media, the seven-kilometre Cyberjaya Malaysia Autonomous Vehicle (MyAV) Testing Route was jointly developed by Futurise and the Ministry of Transport (MoT) under the National Regulatory Sandbox (NRS) initiative for the development of autonomous or self-driving vehicles.

Related Articles:

Food Delivery On The Rise In ASEAN

Quantum Computing Is The Future Of Computers

Read the original post:
Malaysia: Leveraging On Digitalisation Trends - The ASEAN Post

The Biggest Science Stories of 2020 | Technically Speaking – Inside Tucson Business

As we enter a new year, we're taking time to look back at some of the biggest local science stories that came out of the University of Arizona in 2020. Because there's already so much news about COVID-19, we're excluding any pandemic science stories, and instead focusing on research developments coming out of the university.

OSIRIS-REx successfully retrieves asteroid sample. More than four years after launching from Earth, the University of Arizona-led OSIRIS-REx spacecraft captured a sample of an asteroid's surface on Oct. 20, 2020. The NASA spacecraft actually arrived at its destination, the asteroid Bennu more than 200 million miles away, in December 2018, but spent nearly two years orbiting and mapping its surface. The OSIRIS-REx team announced several crucial steps leading up to the sample collection. Close-up imaging showed that the asteroid's surface was far rockier than originally expected. Scans revealed Bennu is packed with more than 200 boulders larger than 33 feet (10 m) in diameter and many more that are 3 feet (1 m) or larger. This meant the spacecraft only had an area the size of a few parking spots from which to collect the samples. The sample process took more than four hours, with the spacecraft slowly descending 2,500 feet from orbit toward the asteroid. While the spacecraft came in contact with the asteroid, it didn't land. Instead, it extended a robotic arm and fired a jet of pressurized nitrogen to kick up dust and rocks from the asteroid's surface. Some of the agitated material was captured in OSIRIS-REx's collector head, and the spacecraft then used thrusters to move away from the asteroid. Scientists believe the spacecraft touched the surface only three feet from where they originally planned. OSIRIS-REx is expected to return the captured dust and rocks to Earth in 2023. With this carbon-rich material, scientists hope to better understand the formation of our early solar system, and even the origins of life on our planet.

Quantum Computing. Three researchers from the University of Arizona's College of Engineering are part of the newly established Superconducting Quantum Materials and Systems Center, led by the U.S. Department of Energy. The $115 million center aims to build a quantum computer and develop quantum sensors that could lead to discoveries about dark matter and other elusive subatomic particles. The involved local researchers are professor of electrical and computer engineering Bane Vasic, assistant professor of materials science and engineering Zheshen Zhang and assistant professor of electrical and computer engineering Quntao Zhuang. Whereas standard computers operate on a binary system of 0s and 1s, quantum computers operate with qubits, which can exist as 0 and 1 simultaneously, making them exponentially more powerful. However, this superposition makes quantum computers far less stable. One of the primary goals of the new center and the local researchers is to increase quantum computers' stability. According to Vasic, designing good quantum error correction codes and decoders is arguably the most important theoretical challenge facing practical realizations of quantum-enabled information processing systems. Zhang argues that quantum computing is going to completely transform our current technology and become a driver for the economy. The researchers expect the center to play a major role in changing the next generation of our workforce.

Personalized Cancer Vaccines. After promising preliminary tests, a study led by UA researcher Dr. Julie Bauman will be expanded to further investigate the safety and effectiveness of a personalized cancer vaccine. Bauman's study uses a patient's own cancer cells to develop a vaccine intended to teach their immune system how to recognize and destroy cancer cells. This personalized vaccine was used in combination with the immunotherapy drug Pembrolizumab. The preliminary test used both of these treatments on 10 patients with head and neck cancer, seven of whom were treated at Banner University Medicine. According to the study, half of the patients experienced a clinical response to the personalized cancer vaccine, and two patients had no detectable disease present after the treatment. This 50% clinical response is much higher than the approximately 15% response rate in patients who receive Pembrolizumab immunotherapy alone. Moving forward, the study will expand to 40 patients with head and neck cancer. According to UA, to identify the patient-specific mutations of the cancer, mutated DNA from the patient's tumor is simultaneously sequenced with healthy DNA from the patient's blood. Computers then compare the two DNA samples to identify the unique cancer mutations.
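The comparison step described above, where tumor and healthy DNA are sequenced side by side to surface tumor-specific mutations, can be illustrated with a deliberately simplified sketch; real pipelines use read aligners and somatic variant callers, and the sequences below are invented:

```python
# Toy version of the tumor-vs-healthy comparison: line up the two sequences
# and report positions where they differ (candidate tumor-specific mutations).
healthy = "ATGCCGTACGTTAGC"
tumor   = "ATGCCGAACGTTAGC"

mutations = [
    (i, h, t)
    for i, (h, t) in enumerate(zip(healthy, tumor))
    if h != t
]
print(mutations)  # [(6, 'T', 'A')] -> one candidate mutation unique to the tumor
```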

Safer Opioids. Researchers at the UA's College of Medicine have found a way to enhance the effectiveness and presumably decrease the side effects of opioid therapy. While opioids are one of the most effective and common treatments for chronic pain, their dangerous side effects and addictive qualities have caused an epidemic in the US, resulting in nearly 50,000 deaths annually. But a potential solution to this high-risk usage was recently found by local researchers, who found that inhibiting heat shock protein 90 in the spinal cord can improve opioid use. According to researcher John Streicher of the UA's Department of Pharmacology, "it seems like heat shock protein 90 is inhibiting one of those pathways in the spinal cord and preventing it from being activated. When we give this inhibitor in the spinal cord, it unblocks that pathway, which provides another route to greater pain relief." The findings suggest that inhibiting heat shock protein 90 could give doctors the opportunity to implement a dose-reduction strategy for patients. Fewer opioid drugs could be prescribed, but patients would get the same levels of pain relief while experiencing reduced side effects.

Technology in the Brain. Researchers at UA, George Washington University and Northwestern University have created an ultra-small, wireless, battery-free device that uses light to record individual neurons so neuroscientists can see how the brain is working. The goal is to better understand the brain, specifically how individual neurons interact with each other. The process first involves tinting select neurons with a dye that changes in brightness depending on activity. Then, the device shines a light on the dye, making the neurons' biochemical processes visible. The device captures the changes using a probe only slightly wider than a human hair, then processes a direct readout of the neurons' activity and transmits the information wirelessly to researchers. The devices in use are smaller than an M&M and only one-twentieth of the weight. They can afford to be so small and flexible because they do not need a battery, instead harvesting energy from external oscillating magnetic fields gathered by a miniature antenna on the device. Ultimately, the technology is intended to help the fight against neurodegenerative diseases such as Alzheimer's and Parkinson's, and perhaps even help us better understand the brain's biological mechanisms, such as pain and depression.

See the article here:
The Biggest Science Stories of 2020 | Technically Speaking - Inside Tucson Business

Farewell 2020: Bleak, Yes. But a Lot of Good Happened Too – HPCwire

Here on the cusp of the new year, the catchphrase "2020 hindsight" has a distinctly different feel. Good riddance, yes. But also proof of science's power to mobilize and do good when called upon. There's gratitude among those who came through less scathed, and maybe more willingness to assist those who didn't.

Despite the unrelenting pandemic, high performance computing (HPC) proved itself an able member of the worldwide community of pandemic fighters. We should celebrate that, perhaps quietly since the work isn't done. HPC made a significant difference in speeding up and enabling vastly distributed research and funneling the results to those who could turn them into patient care, epidemiology guidance, and now vaccines. Remarkable really. Necessary, of course, but it actually got done too. (Forget the quarreling; that's who we are.)

Across the Tabor family of publications, we've run more than 200 pandemic-related articles. I counted nearly 70 significant pieces in HPCwire. The early standing up of Fugaku at RIKEN, now comfortably astride the Top500 for a second time and by a significant margin, to participate in COVID-19 research is a good metaphor for HPC's mobilization. Many people and organizations contributed to the HPC v. pandemic effort, and that continues.

Before spotlighting a few pandemic-related HPC activities and digging into a few other topics, let's do a speed-drive through the 2020 HPC/AI technology landscape.

Consolidation continued among chip players (Nvidia/Arm, AMD/Xilinx) while the AI chip newcomers (Cerebras, Habana (now Intel), SambaNova, Graphcore et al.) were winning deals. Nvidia's new A100 GPU is amazing and virtually everyone else is taking potshots for just that reason. Suddenly RISC-V looks very promising. Systems makers weathered 2020's storm with varying success, while IBM seems to be winding down its HPC focus; it also plans to split/spin off its managed infrastructure services. Firing up Fugaku (notably a non-accelerated system) quickly was remarkable. The planned Frontier (ORNL) supercomputer now has the pole position in the U.S. exascale race, ahead of the delayed Aurora (ANL).

The worldwide quantum computing frenzy is in full froth as the U.S. looks for constructive ways to spend its roughly $1.25 billion (U.S. Quantum Initiative) and, impressively, China just reported a demonstration of quantum supremacy. There's a quiet revolution going on in storage and memory (just ask VAST Data). Nvidia/Mellanox introduced its line of 400 Gb/s network devices while Ethernet launched its 800 Gb/s spec. HPC-in-the-cloud is now a thing, not a soon-to-be thing. AI is no longer an oddity but is quickly infusing throughout HPC (that happened fast).

Last but not least, hyperscalers demonstrably rule the IT roost. Chipmakers used to, consistently punching above their weight (sales volume). Not so much now.

Ok then. Apologies for the many important topics omitted (e.g. exascale and leadership systems, neuromorphic tech, software tools (can oneAPI flourish?), newer fabrics, optical interconnect, etc.).

Let's start.

I want to highlight two HPC pandemic-related efforts, one current and one early on, and also single out the efforts of Oliver Peckham, the HPCwire editor who leads our pandemic coverage, which began in earnest with articles on March 6 (Summit Joins the Fight Against the Coronavirus) and March 13 (Global Supercomputing Is Mobilizing Against COVID-19). Actually, the very first piece, Tech Conferences Are Being Canceled Due to Coronavirus, March 3, was more about interrupted technology events, and we picked it up from our sister pub, Datanami, which ran it on March 2. We've since become a virtualized-event world.

Here's an excerpt from the first Summit piece about modeling COVID-19's notorious spike:

Micholas Smith, a postdoctoral researcher at the University of Tennessee/ORNL Center for Molecular Biophysics (UT/ORNL CMB), used early studies and sequencing of the virus to build a virtual model of the spike protein. [A]fter being granted time on Summit through a discretionary allocation, Smith and his colleagues performed a series of molecular dynamics simulations on the protein, cycling through 8,000 compounds within a few days and analyzing how they bound to the spike protein, if at all.

"Using Summit, we ranked these compounds based on a set of criteria related to how likely they were to bind to the S-protein spike," Smith said in an interview with ORNL. In total, the team identified 77 candidate small-molecule compounds (such as medications) that they considered worthy of further experimentation, helping to narrow the field for medical researchers.

"It took us a day or two whereas it would have taken months on a normal computer," said Jeremy Smith, director of UT/ORNL CMB and principal researcher for the study. "Our results don't mean that we have found a cure or treatment for the Wuhan coronavirus. We are very hopeful, though, that our computational findings will both inform future studies and provide a framework that experimentalists will use to further investigate these compounds. Only then will we know whether any of them exhibit the characteristics needed to mitigate this virus."
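The screening workflow in this excerpt, simulating thousands of compounds and then ranking them by binding criteria, boils down to a sort-and-filter step once each compound has a score. The sketch below is purely illustrative; the compound names, scores and cutoff are invented and are not the study's actual criteria:

```python
# Illustrative ranking of screened compounds by a binding score, in the
# spirit of the Summit workflow described above (more negative ~ stronger
# predicted binding). All values here are made up.
compounds = {"compound_A": -9.2, "compound_B": -6.1, "compound_C": -7.8}

ranked = sorted(compounds.items(), key=lambda kv: kv[1])
shortlist = [name for name, score in ranked if score < -7.0]
print(shortlist)  # ['compound_A', 'compound_C'] -> candidates for lab follow-up
```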

The flood (and diversity) of efforts that followed was startling. Oliver's advice on what to highlight catches the flavor of the challenge: "You could go with something like the Fugaku vs. COVID-19 piece or the grocery store piece, maybe contrast them a bit, earliest vs. current simulations of viral particle spread or something like the LANL retrospective piece vs. the piece I just wrote up on their vaccine modeling. Think that might work for a 'how far we've come' angle, either way."

There's too much to cover.

Last week we ran Oliver's article on LANL efforts to optimize vaccine distribution (At Los Alamos National Lab, Supercomputers Are Optimizing Vaccine Distribution). Here's a brief excerpt:

The new vaccines from Pfizer and Moderna have been deemed highly effective by the FDA; unfortunately, doses are likely to be limited for some time. As a result, many state governments are struggling to weigh difficult choices: should the most exposed, like frontline workers, be vaccinated first? Or perhaps the most vulnerable, like the elderly and immunocompromised? And after them, who's next?

LANL was no stranger to this kind of analysis: earlier in the year, the lab had used supercomputer-powered tools like EpiCast to simulate virtual cities populated by individuals with demographic characteristics to model how COVID-19 would spread under different conditions. "The first thing we looked at was whether it made a difference to prioritize certain populations such as healthcare workers or to just distribute the vaccine randomly," said Sara Del Valle, the LANL computational epidemiologist who is leading the lab's COVID-19 modeling efforts. "We learned that prioritizing healthcare workers first was more effective in reducing the number of COVID cases and deaths."

You get the idea. The well of HPC efforts to tackle and stymie COVID-19 is extremely deep. Turning unproven mRNA technology into a vaccine in record time was awe-inspiring and required many disciplines. For those unfamiliar with the mRNA mechanism, here's a brief CDC explanation as it relates to the new vaccines. Below are links to a few HPCwire articles on the worldwide effort to bring HPC computational power to bear. (The last is a link to the HPCwire COVID-19 Archive, which has links to all our major pandemic coverage):

COVID COVERAGE LINKS

Global Supercomputing Is Mobilizing Against COVID-19 (March 12, 2020)

Gordon Bell Special Prize Goes to Massive SARS-CoV-2 Simulations (November 19, 2020)

Supercomputer Research Leads to Human Trial of Potential COVID-19 Therapeutic Raloxifene (October 29, 2020)

AMD's Massive COVID-19 HPC Fund Adds 18 Institutions, 5 Petaflops of Power (September 14, 2020)

Supercomputer-Powered Research Uncovers Signs of Bradykinin Storm That May Explain COVID-19 Symptoms (July 28, 2020)

Researchers Use Frontera to Investigate COVID-19's Insidious Sugar Coating (June 16, 2020)

COVID-19 HPC Consortium Expands to Europe, Reports on Research Projects (May 28, 2020)

At SC20, an Expert Panel Braces for the Next Pandemic (December 17, 2020)

What's New in Computing vs. COVID-19: Cerebras, Nvidia, OpenMP & More (May 18, 2020)

Billion Molecules Against COVID-19 Challenge to Launch with Massive Supercomputing Support (April 22, 2020)

Pandemic Wipes Out 2020 HPC Market Growth, Flat to 12% Drop Expected (March 31, 2020)

Folding@home Turns Its Massive Crowdsourced Computer Network Against COVID-19 (March 16, 2020)

2020 HPCwire Awards Honor a Year of Remarkable COVID-19 Research (December 23, 2020)

HPCWIRE COVID-19 COVERAGE ARCHIVE

Making sense of the processor world is challenging. Microprocessors are still the workhorses in mainstream computing, with Intel retaining its giant market share despite AMD's encroachment. That said, the rise of heterogeneous computing and blended AI/HPC requirements has shifted focus to accelerators. Nvidia's A100 GPU (54 billion transistors on 826 mm² of silicon, the world's largest seven-nanometer chip) was launched this spring. Then at SC20 Nvidia announced an enhanced version of the A100, doubling its memory to 80GB; it now delivers 2TB/s of bandwidth. The A100 is an impressive piece of work.

The A100's most significant advantage, says Rick Stevens, associate lab director at Argonne National Laboratory, is its multi-instance GPU (MIG) capability.

"For many people the problem is achieving high occupancy, that is, being able to fill the GPU up, because that depends on how much work you have to do. [By] introducing this MIG, this multi-instance stuff that they have, they're able to virtualize it. Most of the real-world performance wins are actually kind of throughput wins by using the virtualization. What we've seen is our big performance improvement is not that individual programs run much faster; it's that we can run up to seven parallel things on each GPU. When you add up the aggregate performance, you get these factors of three to five improvement over the V100," said Stevens.
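Stevens' point is about aggregate throughput rather than single-job speed. A back-of-the-envelope calculation with assumed (not measured) numbers shows how seven MIG slices can land in the three-to-five-times range he cites:

```python
# Even if each virtualized GPU slice is no faster per job, running several
# slices in parallel lifts aggregate throughput. Numbers are illustrative only.
v100_jobs_per_hour = 10          # assumed single-GPU throughput on a V100
a100_slice_jobs_per_hour = 6     # assumed throughput of one MIG slice
mig_slices = 7                   # the A100 supports up to 7 MIG instances

aggregate = a100_slice_jobs_per_hour * mig_slices
print(aggregate / v100_jobs_per_hour)  # 4.2x -- within the 3-5x range cited
```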

Meanwhile, Intel's Xe GPU line is slowly trickling to market, mostly in card form. At SC20 Intel announced plans to make its high performance discrete GPUs available to early access developers. Notably, the new chips have been deployed at ANL and will serve as a transitional development vehicle for the future (2022) Aurora supercomputer, subbing in for the delayed Intel Xe-HPC (Ponte Vecchio) GPUs that are the computational backbone of the system.

AMD, also at SC20, launched its latest GPU, the MI100. AMD says it delivers 11.5 teraflops peak double-precision (FP64), 46.1 teraflops peak single-precision matrix (FP32), 23.1 teraflops peak single-precision (FP32), 184.6 teraflops peak half-precision (FP16) floating-point performance, and 92.3 peak teraflops of bfloat16 performance. HPCwire reported that AMD's MI100 GPU presents a competitive alternative to Nvidia's A100 GPU, rated at 9.7 teraflops of peak theoretical FP64 performance; the A100 is, however, returning even higher performance than that on its FP64 Linpack runs. It will be interesting to see the specs of the GPU AMD eventually fields for use in its exascale system wins.

The stakes are high in what could become a GPU war. Today, Nvidia is the market leader in HPC.

Turning back to CPUs, which many in HPC/AI have begun to regard as the lesser half of CPU/GPU pairings: perhaps that will change with the spectacular showing of Fujitsu's A64FX at the heart of Fugaku. Nvidia's proposed acquisition of Arm, not a done deal yet (regulatory concerns), would likely inject fresh energy into what was already a surging Arm push into the datacenter. Of course, Nvidia has jumped into the systems business with its DGX line and presumably wants a home-grown CPU. The big mover of the last couple of years, AMD's Epyc microprocessor line, continues its steady incursion into Intel x86 territory.

There's not been much discussion around Power10 beyond IBM's summer announcement that Power10 would offer a ~3x performance gain and ~2.6x core efficiency gain over Power9. The new executive director of the OpenPOWER Foundation, James Kulina, says attracting more chipmakers to build Power devices is a top goal. We'll see. RISC-V is definitely drawing interest, but exactly how it fits into the processor puzzle is unclear. Esperanto unveiled a chip aimed at machine learning with 1,100 low-power cores based on the open-source RISC-V architecture, and reported a goal of 4,000 cores on a single device. Europe is betting on RISC-V. However, at least near-term, RISC-V variants are seen as specialized chips.

The CPU waters are murkier than ever.

Sort of off in a land of their own are the AI chip/system players. Their proliferation continues, with the early movers winning important deployments. Some observers think 2021 will start sifting the winners from the losers. Let's not forget that last year Intel stopped development of its newly-acquired Nervana line in favor of its even more newly-acquired Habana products. It's a high-risk, high-reward arena still.

PROCESSOR COVERAGE LINKS

Intel Xe-HP GPU Deployed for Aurora Exascale Development

Is the Nvidia A100 GPU Performance Worth a Hardware Upgrade?

LLNL, ANL and GSK Provide Early Glimpse into Cerebras AI System Performance

David Patterson Kicks Off AI Hardware Summit Championing Domain Specific Chips

Graphcore's IPU Tackles Particle Physics, Showcasing Its Potential for Early Adopters

Intel Debuts Cooper Lake Xeons for 4- and 8-Socket Platforms

Intel Launches Stratix 10 NX FPGAs Targeting AI Workloads

Nvidia's Ampere A100 GPU: Up to 2.5X the HPC, 20X the AI

AMD Launches Three New High-Frequency Epyc SKUs Aimed at Commercial HPC

IBM Debuts Power10; Touts New Memory Scheme, Security, and Inferencing

AMD's Road Ahead: 5nm Epyc, CPU-GPU Coupling, 20% CAGR

AI Newcomer SambaNova GA's Product Lineup and Offers New Service

Japan's AIST Benchmarks Intel Optane; Cites Benefit for HPC and AI

Storage and memory don't get the attention they deserve. 3D XPoint memory (Intel and Micron), declining flash costs, and innovative software are transforming this technology segment. Hard disk drives and tape aren't going away, but traditional storage management approaches such as tiering based on media type (speed/capacity/cost) are under attack. Newcomers WekaIO, VAST Data, and MemVerge are all-in on solid state, and a few leading-edge adopters (NERSC/Perlmutter) are taking the plunge. Data-intensive computing, driven by the data flood and AI compute requirements (gotta keep those GPUs busy!), is a big driver.

"Our storage systems typically see over an exabyte of I/O annually. Balancing this I/O-intensive workload with the economics of storage means that at NERSC, we live and breathe tiering. And this is a snapshot of the storage hierarchy we have on the floor today at NERSC. Although it makes for a pretty picture, we don't have storage tiering because we want to, and in fact, I'd go so far as to say it's the opposite of what we and our users really want. Moving data between tiers has nothing to do with scientific discovery," said NERSC storage architect Glenn Lockwood during an SC20 panel.

"To put some numbers behind this, last year we did a study that found that between 15% and 30% of that exabyte of I/O is not coming from our users' jobs, but instead coming from data movement between storage tiers. That is to say that 15% to 30% of the I/O at NERSC is a complete waste of time in terms of advancing science. But even before that study, we knew that both the changing landscape of storage technology and the emerging large-scale data analysis and AI workloads arriving at NERSC required us to completely rethink our approach to tiered storage," said Lockwood.

Not surprisingly, Intel and Micron (Optane/3D XPoint) are trying to accelerate the evolution. Micron released what it calls a heterogeneous-memory storage engine (HSE) designed for solid-state drives, memory-based storage and, ultimately, applications requiring persistent memory. "Legacy storage engines born in the era of hard disk drives have historically failed to architecturally provide for the increased performance and reduced latency of next-generation nonvolatile media," said the company. Again, we'll see.

Software-defined storage leveraging newer media has all the momentum at the moment, with all of the established players (IBM, DDN, Panasas, etc.) mixing those capabilities into their product sets. WekaIO and Intel have battled it out for the top IO500 spot over the last couple of years, and Intel's DAOS (distributed asynchronous object store) is slated for use in Aurora.

"The concept of asynchronous IO is very interesting," noted Ari Berman, CEO of the BioTeam research consultancy. "It's essentially a queue mechanism at the system write level, so system waits in the processors don't have to happen while a confirmed write-back comes from the disks. So asynchronous IO allows jobs to keep running while you're waiting on storage to happen, to a limit of course. That would really improve the data input-output pipelines in those systems. It's a very interesting idea. I like asynchronous data writes and asynchronous storage access. I can see there very easily being corruption that creeps into those types of things and data without very careful sequencing. It will be interesting to watch. If it works it will be a big innovation."
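Berman's description, a queue at the write level so the job keeps running while the write completes, maps onto ordinary asynchronous I/O. The sketch below is a minimal asyncio illustration of that overlap, not how DAOS is implemented; the sleeps stand in for storage latency and for the next chunk of compute:

```python
# Minimal asyncio sketch: the job does not block on the storage system
# before moving on to its next piece of work.
import asyncio, time

async def slow_write(data):
    await asyncio.sleep(1.0)          # stand-in for storage latency
    return f"wrote {len(data)} bytes"

async def compute():
    await asyncio.sleep(1.0)          # stand-in for the next chunk of work
    return "computed next step"

async def job():
    start = time.perf_counter()
    # Issue the write and the next piece of work together.
    write_result, compute_result = await asyncio.gather(
        slow_write(b"x" * 1024), compute()
    )
    print(write_result, compute_result, f"{time.perf_counter() - start:.1f}s")

asyncio.run(job())   # ~1.0s total, not ~2.0s, because the two overlap
```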

Change is afoot and the storage technology community is adapting. Memory technology is also advancing.

Micron introduced a 176-layer 3D NAND flash memory at SC20 that it says increases read and write densities by more than 35 percent. JEDEC published the DDR5 SDRAM spec, the next-generation standard for random access memory (RAM), in the summer. Compared to DDR4, the DDR5 spec will deliver twice the performance and improved power efficiency, addressing ever-growing demand from datacenter and cloud environments, as well as artificial intelligence and HPC applications. At launch, DDR5 modules will reach 4.8 Gbps, providing a 50 percent improvement versus the previous generation. Density goes up four-fold, with maximum density increasing from 16 gigabits per die to 64 gigabits per die in the new spec. JEDEC representatives indicated there will be 8 Gb and 16 Gb DDR5 products at launch.

There are always the wildcards. IBM's memristive technology is moving closer to practical use. One outlier is DNA-based storage. Dave Turek, longtime IBMer, joined DNA storage start-up Catalog this year and says Catalog is working on proofs of concept with government agencies and a number of Fortune 500 companies. "Some of these are who's-who HPC players, but some are non-HPC players, many names you would recognize. We're at what I would say is the beginning of the commercial beginning." Again, we'll see.

STORAGE & MEMORY LINKS

SC20 Panel – OK, You Hate Storage Tiering. What's Next Then?

Intel's Optane/DAOS Solution Tops Latest IO500

Startup MemVerge on Memory-centric Mission

HPC Strategist Dave Turek Joins DNA Storage (and Computing) Company Catalog

DDN-Tintri Showcases Technology Integration with Two New Products

Intel Refreshes Optane Persistent Memory, Adds New NAND SSDs

Micron Boosts Flash Density with 176-Layer 3D NAND

DDR5 Memory Spec Doubles Data Rate, Quadruples Density

IBM Touts STT MRAM Technology at IEDM 2020

The Distributed File Systems and Object Storage Landscape: Who's Leading?

It's tempting to omit quantum computing this year. Too much happened to summarize easily, and the overall feel is of steady carry-on progress from 2019. There was, perhaps, a stronger pivot, at least by press release count, towards seeking early applications for near-term noisy intermediate-scale quantum (NISQ) computers. Ion trap qubit technology got another important player in Honeywell, which formally rolled out its effort and first system. Intel also stepped out from the shadows a bit in terms of showcasing its efforts. D-Wave launched a giant 5,000-qubit machine (Advantage), again using a quantum annealing approach that's different from universal gate-based quantum systems. IBM announced a stretch goal of achieving one million qubits!

Calling quantum computing a market is probably premature, but monies are being spent. The Quantum Economic Development Consortium (QED-C) and Hyperion Research issued a forecast that projects the global quantum computing (QC) market, worth an estimated $320 million in 2020, growing at a 27% CAGR between 2020 and 2024. That would reach approximately $830 million by 2024. Chump change? Perhaps, but real activity.
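The projection is internally consistent: compounding $320 million at 27% a year over the four years from 2020 to 2024 lands close to the $830 million figure.

```python
# Checking the Hyperion/QED-C arithmetic: $320M growing at a 27% CAGR
# from 2020 to 2024 is four compounding steps.
base, cagr, years = 320, 0.27, 4
projection = base * (1 + cagr) ** years
print(round(projection))  # ~832 (million USD), i.e. roughly the $830M cited
```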

IBM's proposed Quantum Volume (QV) metric has drawn support as a broad benchmark of quantum computer performance. Honeywell promoted the QV of 128 achieved by its launch system. In December IBM reported it too had achieved a QV of 128. The first QV reported by IBM was 16, in 2019 at the APS March meeting. Just what a QV of 128 means in determining practical usefulness is unclear, but it is steady progress, and even Intel agrees that QV is as good as any measure at the moment. DoE is also working on benchmarks, focusing a bit more on performance on given workloads.
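For readers wondering what the number means: IBM defines quantum volume as 2 raised to the size of the largest "square" random circuit (equal width and depth) a machine can run successfully, so the scores translate into quite small circuit sizes.

```python
# Quantum volume (IBM's definition): QV = 2**m, where m is the largest model
# circuit with m qubits and depth m that the machine executes successfully.
import math

for qv in (16, 64, 128):
    print(f"QV {qv} -> {int(math.log2(qv))} qubits wide and deep")
# QV 16 -> 4, QV 64 -> 6, QV 128 -> 7
```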

"[One] major component of benchmarking is asking what kind of resources does it take to run this or that interesting problem. Again, these are problems of interest to DoE, so basic science problems in chemistry and nuclear physics and things like that. What we'll do is take applications in chemistry and nuclear physics and convert them into what we consider a benchmark. We consider it a benchmark when we can distill a metric from it. So the metric could be the accuracy, the quality of the solution, or the resources required to get a given level of quality," said Raphael Pooser, PI for DoE's Quantum Testbed Pathfinder project at ORNL, during an HPCwire interview.

Next year seems likely to bring more benchmarking activity around system quality, qubit technology, and performance on specific problem sets. Several qubit technologies still vie for sway: superconducting, trapped ion, optical, quantum dots, cold atoms, et al. The need to operate at near-zero (kelvin) temperatures complicates everything. Google claimed to have achieved quantum supremacy last year. This year a group of Chinese researchers also did so. The groups used different qubit technologies (superconducting v. optical), and China's effort tried to skirt criticisms that were lobbed at Google's effort. Frankly, both efforts were impressive. Russia reported early last year it would invest $790 million in quantum, with achieving quantum supremacy as one goal.

What's happening now is a kind of pell-mell rush among a larger and increasingly diverse quantum ecosystem (hardware, software, consultants, governments, academia). Fault-tolerant quantum computing still seems distant, but clever algorithms and error mitigation strategies to make productive use of NISQ systems, likely on narrow applications, look more and more promising.


The persistent question is when will all of these efforts pay off and will they be as game-changing as many believe. With new money flowing into quantum, one has the sense there will be few abrupt changes in the next couple years barring untoward economic turns.

QUANTUM COVERAGE LINKS

IBM's Quantum Race to One Million Qubits

Google's Quantum Chemistry Simulation Suggests Promising Path Forward

Intel Connects the (Quantum) Dots in Accelerating Quantum Computing Effort

D-Wave Delivers 5000-qubit System; Targets Quantum Advantage

Honeywell Debuts Quantum System, Subscription Business Model, and Glimpse of Roadmap

Global QC Market Projected to Grow to More Than $800 million by 2024

ORNLs Raphael Pooser on DoEs Quantum Testbed Project

Rigetti Computing Wins $8.6M DARPA Grant to Demonstrate Practical Quantum Computing

Braket: Amazon's Cloud-First Quantum Environment Is Generally Available

IBM-led Webinar Tackles Quantum Developer Community Needs

Microsoft's Azure Quantum Platform Now Offers Toshiba's Simulated Bifurcation Machine

As always, there's personnel shuffling. Lately hyperscalers have been taking HPC folks. Two long-time Intel executives, Debra Goldfarb and Bill Magro, recently left for the cloud: Goldfarb to AWS as director for HPC products and strategy, and Magro to Google as CTO for HPC. Going in the other direction, John Martinis left Google's quantum development team and recently joined Australian start-up Silicon Quantum Computing. Ginni Rometty, of course, stepped down as CEO and chairman at IBM. IBM's long-time HPC exec Dave Turek left to take a position with DNA storage start-up Catalog, and last January, IBMer Brad McCredie joined AMD as corporate VP, GPU platforms.

See the original post:
Farewell 2020: Bleak, Yes. But a Lot of Good Happened Too - HPCwire

These Were Our Favorite Tech Stories From Around the Web in 2020 – Singularity Hub

This time last year we were commemorating the end of a decade and looking ahead to the next one. Enter the year that felt like a decade all by itself: 2020. News written in January, the before-times, feels hopelessly out of touch with all that came after. Stories published in the early days of the pandemic are, for the most part, similarly naive.

The year's news cycle was swift and brutal, ping-ponging from pandemic to extreme social and political tension, whipsawing economies, and natural disasters. Hope. Despair. Loneliness. Grief. Grit. More hope. Another lockdown. It's been a hell of a year.

Though 2020 was dominated by big, hairy societal change, science and technology took significant steps forward. Researchers singularly focused on the pandemic and collaborated on solutions to a degree never before seen. New technologies converged to deliver vaccines in record time. The dark side of tech, from biased algorithms to the threat of omnipresent surveillance and corporate control of artificial intelligence, continued to rear its head.

Meanwhile, AI showed uncanny command of language, joined Reddit threads, and made inroads into some of science's grandest challenges. Mars rockets flew for the first time, and a private company delivered astronauts to the International Space Station. Deprived of night life, concerts, and festivals, millions traveled to virtual worlds instead. Anonymous jet packs flew over LA. Mysterious monoliths appeared and disappeared worldwide.

It was all, you know, very 2020. For this year's (in-no-way-all-encompassing) list of fascinating stories in tech and science, we tried to select those that weren't totally dated by the news, but rose above it in some way. So, without further ado: this year's picks.

How Science Beat the Virus
Ed Yong | The Atlantic
Much like famous initiatives such as the Manhattan Project and the Apollo program, epidemics focus the energies of large groups of scientists. But nothing in history was even close to the level of pivoting that's happening right now, Madhukar Pai of McGill University told me. No other disease has been scrutinized so intensely, by so much combined intellect, in so brief a time.

It Will Change Everything: DeepMind's AI Makes Gigantic Leap in Solving Protein Structures
Ewen Callaway | Nature
In some cases, AlphaFold's structure predictions were indistinguishable from those determined using gold standard experimental methods such as X-ray crystallography and, in recent years, cryo-electron microscopy (cryo-EM). AlphaFold might not obviate the need for these laborious and expensive methods yet, say scientists, but the AI will make it possible to study living things in new ways.

OpenAI's Latest Breakthrough Is Astonishingly Powerful, But Still Fighting Its Flaws
James Vincent | The Verge
What makes GPT-3 amazing, they say, is not that it can tell you that the capital of Paraguay is Asunción (it is) or that 466 times 23.5 is 10,987 (it's not), but that it's capable of answering both questions and many more besides simply because it was trained on more data for longer than other programs. If there's one thing we know that the world is creating more and more of, it's data and computing power, which means GPT-3's descendants are only going to get more clever.

Artificial General Intelligence: Are We Close, and Does It Even Make Sense to Try?
Will Douglas Heaven | MIT Technology Review
A machine that could think like a person has been the guiding vision of AI research since the earliest days, and remains its most divisive idea. So why is AGI controversial? Why does it matter? And is it a reckless, misleading dream, or the ultimate goal?

The Dark Side of Big Tech's Funding for AI Research
Tom Simonite | Wired
Timnit Gebru's exit from Google is a powerful reminder of how thoroughly companies dominate the field, with the biggest computers and the most resources. [Meredith] Whittaker of AI Now says properly probing the societal effects of AI is fundamentally incompatible with corporate labs. That kind of research that looks at the power and politics of AI is and must be inherently adversarial to the firms that are profiting from this technology.

We're Not Prepared for the End of Moore's Law
David Rotman | MIT Technology Review
Quantum computing, carbon nanotube transistors, even spintronics, are enticing possibilities, but none are obvious replacements for the promise that Gordon Moore first saw in a simple integrated circuit. We need the research investments now to find out, though. Because one prediction is pretty much certain to come true: we're always going to want more computing power.

Inside the Race to Build the Best Quantum Computer on Earth
Gideon Lichfield | MIT Technology Review
Regardless of whether you agree with Google's position [on quantum supremacy] or IBM's, the next goal is clear, Oliver says: to build a quantum computer that can do something useful. The trouble is that it's nearly impossible to predict what the first useful task will be, or how big a computer will be needed to perform it.

The Secretive Company That Might End Privacy as We Know It
Kashmir Hill | The New York Times
Searching someone by face could become as easy as Googling a name. Strangers would be able to listen in on sensitive conversations, take photos of the participants and know personal secrets. Someone walking down the street would be immediately identifiable, and his or her home address would be only a few clicks away. It would herald the end of public anonymity.

Wrongfully Accused by an Algorithm
Kashmir Hill | The New York Times
Mr. Williams knew that he had not committed the crime in question. What he could not have known, as he sat in the interrogation room, is that his case may be the first known account of an American being wrongfully arrested based on a flawed match from a facial recognition algorithm, according to experts on technology and the law.

Predictive Policing Algorithms Are Racist. They Need to Be Dismantled.
Will Douglas Heaven | MIT Technology Review
A number of studies have shown that these tools perpetuate systemic racism, and yet we still know very little about how they work, who is using them, and for what purpose. All of this needs to change before a proper reckoning can take place. Luckily, the tide may be turning.

The Panopticon Is Already Here
Ross Andersen | The Atlantic
Artificial intelligence has applications in nearly every human domain, from the instant translation of spoken language to early viral-outbreak detection. But Xi [Jinping] also wants to use AI's awesome analytical powers to push China to the cutting edge of surveillance. He wants to build an all-seeing digital system of social control, patrolled by precog algorithms that identify potential dissenters in real time.

The Case For Cities That Aren't Dystopian Surveillance States
Cory Doctorow | The Guardian
Imagine a human-centered smart city that knows everything it can about things. It knows how many seats are free on every bus, it knows how busy every road is, it knows where there are short-hire bikes available and where there are potholes. What it doesn't know is anything about individuals in the city.

The Modern World Has Finally Become Too Complex for Any of Us to Understand
Tim Maughan | OneZero
One of the dominant themes of the last few years is that nothing makes sense. I am here to tell you that the reason so much of the world seems incomprehensible is that it is incomprehensible. From social media to the global economy to supply chains, our lives rest precariously on systems that have become so complex, and we have yielded so much of it to technologies and autonomous actors that no one totally comprehends it all.

The Conscience of Silicon Valley
Zach Baron | GQ
What I really hoped to do, I said, was to talk about the future and how to live in it. This year feels like a crossroads; I do not need to explain what I mean by this. I want to destroy my computer, through which I now work and have drinks and stare at blurry simulations of my parents sometimes; I want to kneel down and pray to it like a god. I want someone, I want Jaron Lanier, to tell me where we're going, and whether it's going to be okay when we get there. Lanier just nodded. All right, then.

Yes to Tech Optimism. And Pessimism.
Shira Ovide | The New York Times
Technology is not something that exists in a bubble; it is a phenomenon that changes how we live or how our world works in ways that help and hurt. That calls for more humility and bridges across the optimism-pessimism divide from people who make technology, those of us who write about it, government officials and the public. We need to think on the bright side. And we need to consider the horribles.

How Afrofuturism Can Help the World Mend
C. Brandon Ogbunu | Wired
[W. E. B. DuBois'] The Comet helped lay the foundation for a paradigm known as Afrofuturism. A century later, as a comet carrying disease and social unrest has upended the world, Afrofuturism may be more relevant than ever. Its vision can help guide us out of the rubble, and help us to consider universes of better alternatives.

Wikipedia Is the Last Best Place on the Internet
Richard Cooke | Wired
More than an encyclopedia, Wikipedia has become a community, a library, a constitution, an experiment, a political manifesto, the closest thing there is to an online public square. It is one of the few remaining places that retains the faintly utopian glow of the early World Wide Web.

Can Genetic Engineering Bring Back the American Chestnut?
Gabriel Popkin | The New York Times Magazine
The geneticists' research forces conservationists to confront, in a new and sometimes discomfiting way, the prospect that repairing the natural world does not necessarily mean returning to an unblemished Eden. It may instead mean embracing a role that we've already assumed: engineers of everything, including nature.

At the Limits of Thought
David C. Krakauer | Aeon
A schism is emerging in the scientific enterprise. On the one side is the human mind, the source of every story, theory, and explanation that our species holds dear. On the other stand the machines, whose algorithms possess astonishing predictive power but whose inner workings remain radically opaque to human observers.

Is the Internet Conscious? If It Were, How Would We Know?
Meghan O'Gieblyn | Wired
Does the internet behave like a creature with an internal life? Does it manifest the fruits of consciousness? There are certainly moments when it seems to. Google can anticipate what you're going to type before you fully articulate it to yourself. Facebook ads can intuit that a woman is pregnant before she tells her family and friends. It is easy, in such moments, to conclude that you're in the presence of another mind, though given the human tendency to anthropomorphize, we should be wary of quick conclusions.

The Internet Is an Amnesia Machine
Simon Pitt | OneZero
There was a time when I didn't know what a Baby Yoda was. Then there was a time I couldn't go online without reading about Baby Yoda. And now, Baby Yoda is a distant, shrugging memory. Soon there will be a generation of people who missed the whole thing and for whom Baby Yoda is as meaningless as it was for me a year ago.

Digital Pregnancy Tests Are Almost as Powerful as the Original IBM PC
Tom Warren | The Verge
Each test, which costs less than $5, includes a processor, RAM, a button cell battery, and a tiny LCD screen to display the result. Foone speculates that this device is probably faster at number crunching and basic I/O than the CPU used in the original IBM PC. IBM's original PC was based on Intel's 8088 microprocessor, an 8-bit chip that operated at 5 MHz. The difference here is that this is a pregnancy test you pee on and then throw away.

The Party Goes on in Massive Online Worlds
Cecilia D'Anastasio | Wired
We're more stand-outside types than the types to cast a flashy glamour spell and chat up the nearest cat girl. But, hey, it's Final Fantasy XIV online, and where my body sat in New York, the epicenter of America's Covid-19 outbreak, there certainly weren't any parties.

The Facebook Groups Where People Pretend the Pandemic Isn't Happening
Kaitlyn Tiffany | The Atlantic
Losing track of a friend in a packed bar or screaming to be heard over a live band is not something that's happening much in the real world at the moment, but it happens all the time in the 2,100-person Facebook group "a group where we all pretend we're in the same venue." So does losing shoes and Juul pods, and shouting matches over which bands are the saddest, and therefore the greatest.

Did You Fly a Jetpack Over Los Angeles This Weekend? Because the FBI Is Looking for You
Tom McKay | Gizmodo
Did you fly a jetpack over Los Angeles at approximately 3,000 feet on Sunday? Some kind of tiny helicopter? Maybe a lawn chair with balloons tied to it? If the answer to any of the above questions is yes, you should probably lay low for a while (by which I mean cool it on the single-occupant flying machine). That's because passing airline pilots spotted you, and now it's this whole thing with the FBI and the Federal Aviation Administration, both of which are investigating.

Image Credit: Thomas Kinto / Unsplash

View original post here:
These Were Our Favorite Tech Stories From Around the Web in 2020 - Singularity Hub

Quantum computers’ power will remake competition in industries from technology to finance – MarketWatch

Quantum computers, once fully scaled, could lead to breakthroughs on many fronts: medicine, finance, architecture, logistics.

First, it's important to understand why quantum computers are superior to the conventional ones we've been using for years:

In conventional electronic devices, memory consists of bits with only one value, either 0 or 1. In quantum computing, a quantum bit (qubit) exhibits both values in varying degrees at the same time. This is called quantum superposition. These superposed states of each qubit are then used in complex calculations, whose outputs read like regular bits: 0 or 1.

Since qubits can store more information than regular bits, quantum computers are capable of processing greater quantities of information. Having four bits enables 16 possibilities, but only one at a time. Four qubits in quantum superposition, however, let you calculate with all 16 states at once. By the same logic, a register of 16 qubits spans 2^16, or roughly 65,500, classical states. Each qubit added to the quantum computing system increases its power exponentially.
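A quick numerical sketch of that exponential growth (the rounded 65,500 figure above corresponds to 2^16, the number of basis states a 16-qubit register spans):

```python
import numpy as np

n = 4
state = np.full(2**n, 1 / np.sqrt(2**n))   # equal superposition over 16 basis states
print(len(state), np.isclose(np.sum(np.abs(state) ** 2), 1.0))  # 16 True

for qubits in (4, 16, 50):
    print(qubits, "qubits ->", 2 ** qubits, "classical basis states")
# 4 -> 16, 16 -> 65,536, 50 -> ~1.1e15: each added qubit doubles the state space
```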

To put things in perspective, a top supercomputer can currently accomplish about as much as a five- to 20-qubit quantum computer, but it's estimated that a 50-qubit quantum computer will be able to solve computational problems no conventional device can in any feasible amount of time.

This quantum supremacy has been achieved many times so far. It's important to mention that this doesn't mean the quantum computer can beat a traditional one at every task; rather, it shines only in a limited set of tasks specially tailored to highlight its strengths. Also, a quantum computer still needs to overcome many obstacles before it can become a mainstream device.

But once it does, its computational power will boost science and industries that profit from it.

Large companies working on quantum computing in their respective industries include AT&T (T), Google holding company Alphabet (GOOG, GOOGL), IBM (IBM) and Microsoft (MSFT).

Here are a few industries that could benefit the most:

Quantum chemistry, also called molecular quantum mechanics, is a branch of chemistry focused on the application of quantum mechanics to chemical systems. Here, quantum computers help in molecule modeling, taking into account all of their possible quantum states, a feat that is beyond the power of conventional computing.

That, in turn, helps us understand their properties, which is invaluable for new material and medicine research.

Quantum cryptography, also known as quantum encryption, employs principles of quantum mechanics to facilitate encryption and protection of encrypted data from tampering. Using the peculiar behavior of subatomic particles, it enables the reliable detection of tampering or eavesdropping (via the Quantum Key Distribution method).

Quantum encryption is also used for secure encryption key transfer, which is based on the entanglement principle. Both methods are currently available, but due to their complexity and price, only governments and institutions handling delicate data (most notably in China and the U.S.) can afford them for the time being.
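The tamper-detection property mentioned above can be illustrated with a classical toy model of the BB84 key-distribution statistics (this only simulates the bookkeeping, not quantum optics, and every name and number below is invented): an eavesdropper who measures in randomly chosen bases introduces roughly a 25% error rate into the sifted key, which the legitimate parties can detect by comparing a sample of bits.

```python
import random

def sifted_key_error_rate(n_rounds, eavesdrop):
    """Fraction of mismatched bits in rounds where sender and receiver used the same basis."""
    errors = kept = 0
    for _ in range(n_rounds):
        bit = random.randint(0, 1)
        basis_a = random.choice("XZ")             # sender's preparation basis
        basis_b = random.choice("XZ")             # receiver's measurement basis
        received = bit
        if eavesdrop and random.choice("XZ") != basis_a:
            received = random.randint(0, 1)       # wrong-basis measurement scrambles the bit
        if basis_a == basis_b:                    # sifting: keep only matching-basis rounds
            kept += 1
            errors += received != bit
    return errors / max(kept, 1)

print(round(sifted_key_error_rate(20_000, eavesdrop=False), 2))  # ~0.0
print(round(sifted_key_error_rate(20_000, eavesdrop=True), 2))   # ~0.25 -> tampering detected
```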

Quantum finance is an interdisciplinary research field that applies theories and methods developed by quantum physicists and economists to solve problems in finance. This especially includes complex calculations, such as the pricing of various financial instruments and other computational finance problems.

Some scientists argue that quantum pricing models will provide more accuracy than classical ones because they're able to take into account market inefficiency, which is something classical models disregard.

Quantum computing will also enhance the analysis of large and unstructured data sets, which will improve decision-making across different areas, from better-timed offers to risk assessment. Many of these calculations will require a quantum computer with thousands of qubits to resolve, but the way things have been progressing recently, it's not unrealistic to see quantum computers reach this processing potential in a matter of years, rather than decades.

Although still in the domain of conceptual research, principles of quantum mechanics will help quantum computers achieve markedly greater speed and efficiency than what is currently possible on classical computers when executing AI algorithms; this goes especially for machine learning.

Current computational models used in weather forecasting employ dynamic variables, from air temperature, pressure and density to historic data and other factors that go into creating climate prediction models. Due to limited available processing power, classical computers and even conventional supercomputers are the bottlenecks that limit the speed and efficacy of forecasting calculations.

To predict extreme weather events and limit the loss of life and property, we need faster and more robust forecasting models. By harnessing the power of qubits, quantum computing is capable of providing the necessary raw processing power to make that happen. Furthermore, machine learning provided by quantum AI can additionally improve these forecasting models.

Despite its rapid progress, quantum computing is still in its infancy, but it's clearly a game changer, capable of solving problems previously deemed insurmountable for classical computers.

This power will provide most benefits not only to science and medicine, but also to businesses and industries where fast processing of large datasets is paramount.

As a marketing specialist, I can see a huge advantage for my industry, but others, especially finance and cryptography, will undoubtedly find the quantum boost to their decision-making processes and quality of their final product hugely beneficial.

The real question is who will be the first to harness this power and use quantum computing as a part of their unique value proposition and competitive advantage? The race is on.

Read the rest here:
Quantum computers' power will remake competition in industries from technology to finance - MarketWatch