Archive for the ‘Quantum Computing’ Category

Six faculty elected to National Academy of Sciences – Stanford Today – Stanford University News

Six Stanford University researchers are among the 120 newly elected members of the National Academy of Sciences. Scientists are elected to the NAS by their peers.

The six Stanford faculty members newly elected to the National Academy of Sciences. (Image credit: Andrew Brodhead)

The new members from Stanford are Savas Dimopoulos, the Hamamoto Family Professor and professor of physics in the School of Humanities and Sciences; Daniel Freedman, a visiting professor at the Stanford Institute for Theoretical Physics (SITP) and professor of applied mathematics and theoretical physics, emeritus, at MIT; Judith Frydman, professor of biology and the Donald Kennedy Chair in the School of Humanities and Sciences, and professor of genetics in the Stanford School of Medicine; Kathryn A. (Kam) Moler, vice provost and dean of research, and the Marvin Chodorow Professor and professor of applied physics and of physics in the School of Humanities and Sciences; Tirin Moore, professor of neurobiology in the Stanford School of Medicine; and John Rickford, professor of linguistics and the J.E. Wallace Sterling Professor in the Humanities, emeritus, in the School of Humanities and Sciences.

Savas Dimopoulos collaborates on a number of experiments that use the dramatic advances in atom interferometry to do fundamental physics. These include testing Einstein's theory of general relativity to fifteen decimals of precision, testing atom neutrality to thirty decimals, and looking for modifications of quantum mechanics. He is also designing an atom-interferometric gravity-wave detector that will allow us to look at the universe with gravity waves instead of light.

Daniel Freedman's research is in quantum field theory, quantum gravity and string theory, with an emphasis on the role of supersymmetry. Freedman, along with physicists Sergio Ferrara and Peter van Nieuwenhuizen, developed the theory of supergravity. A combination of the principles of supersymmetry and general relativity, supergravity is a deeply influential blueprint for unifying all of nature's fundamental interactions.

Judith Frydman uses a multidisciplinary approach to address fundamental questions about protein folding and degradation, and molecular chaperones, which help facilitate protein folding. In addition, this work aims to define how impairment of cellular folding and quality control is linked to disease, including cancer and neurodegenerative diseases, and to examine whether reengineering chaperone networks can provide therapeutic strategies.

Kam Moler's research involves developing new tools to measure magnetic properties of quantum materials and devices on micron length-scales. These tools can then be used to investigate fundamental materials physics, superconducting devices and exotic Josephson effects, a phenomenon in superconductors that shows promise for quantum computing.

Tirin Moore studies the activity of single neurons and populations of neurons in areas of the brain that relate to visual and motor functions. His lab explores the consequences of changes in that activity and aims to develop innovative approaches to fundamental problems in systems and circuit-level neuroscience.

John Rickford's research and teaching are focused on sociolinguistics, the relation between linguistic variation and change and social structure. He is especially interested in the relation between language and ethnicity, social class and style, language variation and change, pidgin and creole languages, African American Vernacular English, and the applications of linguistics to educational problems.

The academy is a private, nonprofit institution that was created in 1863 to advise the nation on issues related to science and technology. Scholars are elected in recognition of their outstanding contributions to research. This year's election brings the total of active academy members to 2,461.


Parity-preserving and magnetic field-resilient superconductivity in InSb nanowires with Sn shells – Science Magazine

Move aside, aluminum

Some of the most promising schemes for quantum information processing involve superconductors. In addition to the established superconducting qubits, topological qubits may one day be realized in semiconductor-superconductor heterostructures. The superconductor most widely used in this context is aluminum, in which processes that cause decoherence are suppressed. Pendharkar et al. go beyond this paradigm to show that superconducting tin can be used in place of aluminum (see the Perspective by Fatemi and Devoret). The authors grew nanowires of indium antimonide, which is a semiconductor, and coated them with a thin layer of tin without using cumbersome epitaxial growth techniques. This process creates a well-defined, hard superconducting gap in the nanowires, which is a prerequisite for using them as the basis for a potential topological qubit.

Science, this issue p. 508; see also p. 464

Improving materials used to make qubits is crucial to further progress in quantum information processing. Of particular interest are semiconductor-superconductor heterostructures that are expected to form the basis of topological quantum computing. We grew semiconductor indium antimonide nanowires that were coated with shells of tin of uniform thickness. No interdiffusion was observed at the interface between Sn and InSb. Tunnel junctions were prepared by in situ shadowing. Despite the lack of lattice matching between Sn and InSb, a 15-nanometer-thick shell of tin was found to induce a hard superconducting gap, with superconductivity persisting in magnetic fields up to 4 teslas. A small island of Sn-InSb exhibits the two-electron charging effect. These findings suggest a less restrictive approach to fabricating superconducting and topological quantum circuits.


Selected to Build New Supercomputer for the National Supercomputing Centre Singapore – HPCwire

HOUSTON, April 27, 2021 – Hewlett Packard Enterprise today announced it has been awarded $40M SGD to build a new supercomputer for the National Supercomputing Centre (NSCC) Singapore, the national high-performance computing (HPC) resource center dedicated to supporting science and engineering computing needs for academic, research and industry communities. The new system, which will be 8X faster than NSCC's existing pool of HPC resources, will expand and augment ongoing research efforts by enabling tools such as artificial intelligence (AI) and deep machine learning to optimize modeling, simulation and even software simulation for quantum computing. NSCC will use the system to unlock scientific discoveries across medicine, diseases, climate, engineering and more.

The new supercomputer was funded through a SGD200 million investment that was announced by the Singapore government in March 2019 to boost Singapore's high-performance computing resources.

Fueling a new supercomputing journey at the National Supercomputing Centre Singapore

The NSCC's new supercomputer will be built and powered using the HPE Cray EX supercomputer, an HPC system designed to support next-generation supercomputing, such as Exascale-class systems, that also features a full stack of purpose-built technologies across compute, software, storage and networking to harness insights from vast, complex data more quickly and efficiently. The advanced performance will help tackle compute- and data-intensive modeling and simulation needs requiring higher speed and targeted HPC and artificial intelligence capabilities.

The new system will be housed in a new data center designed to increase sustainability and reduce energy consumption. To further support NSCCs mission for a greener data center, the new system will leverage liquid-cooling capabilities made possible through the HPE Cray EX supercomputer to increase energy efficiency and power density by transferring heat generated by the new supercomputer with a liquid-cooled process.

The combination of these advanced technologies will enable the NSCC's existing community of researchers and scientists to further their R&D efforts to make breakthroughs in a range of areas, some of which include:

"We are inspired by how Singapore's community of scientists have leveraged high performance computing to improve ongoing research efforts. We are honored to continue empowering their mission by building them a powerful system using the HPE Cray EX supercomputer that delivers comprehensive, purposely-engineered technologies for demanding research," said Bill Mannel, vice president and general manager, HPE. "The new system will deliver a significant boost to R&D, allowing Singapore's community of scientists and engineers to make greater contributions that will unlock innovation, economic value, and overall, strengthen the nation's position in becoming more digitally-driven."

"Supercomputers have enabled the scientific community in Singapore to make significant strides in their research," said Associate Professor Tan Tin Wee, Chief Executive at the National Supercomputing Centre (NSCC) Singapore. "The new system will provide the necessary resources to meet the growing supercomputing needs of our researchers, and to enable more of such significant scientific breakthroughs at the national and global level."

The NSCC's supercomputer unlocks a new level of scientific discovery with advanced technologies

The HPE Cray EX supercomputer powering NSCC's new system is purpose-built to deliver petaflop-to-exaflop performance with the world's most energy-efficient footprint. It also includes the HPE Cray EX software stack for software-defined capabilities that allow the NSCC's users to gain the high performance of a supercomputer through the operational experience of a cloud. Additionally, HPE will integrate the following next-generation technologies with the HPE Cray EX supercomputer:

The new system will be operational in early 2022. To learn more about NSCC and Singapore's national HPC resources, please visit www.nscc.sg.

About Hewlett Packard Enterprise

Hewlett Packard Enterprise (NYSE: HPE) is the global edge-to-cloud platform as-a-service company that helps organizations accelerate outcomes by unlocking value from all of their data, everywhere. Built on decades of reimagining the future and innovating to advance the way people live and work, HPE delivers unique, open and intelligent technology solutions, with a consistent experience across all clouds and edges, to help customers develop new business models, engage in new ways, and increase operational performance. For more information, visit: www.hpe.com.

Source: HPE


Quantum computers, here but not here: what businesses need to know – Verdict

The future may see us living on Mars, paying for everything with crypto, and relaxing or working as we travel effortlessly about in our driverless cars. But there's an even bigger change coming for many of us, and that's the gradual advent of quantum computing (QC) and what it means for the world of business.

People in the tech business are used to hearing about quantum computing, because its effects, as and when it can be delivered at scale, will be so gigantic. At the same time, it tends to get put in the same folder as fusion power or directed energy weapons: technologies that have been perpetually five years away for many decades.

This long-established position, for many readers, may have obscured the new reality: quantum computing is actually here in the real world nowadays, albeit on a small scale. It's in use right now by businesses such as IBM and Amazon. This month, EU/US audiences (April 21st) and those in APAC (April 22nd) can learn all about the new state of play at a free-to-attend webinar with GlobalData analysts, focused on the real-world business landscape rather than academic theory.

That academic theory of QC is usually explained by saying that where a normal computer operates using bits of information, a quantum computer uses quantum bits or qubits. A normal bit is 1 or 0, on or off: a qubit is much more complicated. When it is measured it will be either 1 or 0; before that, it exists in a quantum superposition of those two states. The quantum superposition is usually described using complex numbers, mathematics based on the so-called imaginary unit, the square root of minus one.

Another way of visualising this is that normal bits are like coins lying on a table. They are either heads or tails up: they can be flipped over. A qubit, however, is like a coin spinning in the air. It can interact with other spinning coins, affecting how they spin, but none of them are heads or tails up until the quantum operations are complete.
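That amplitude picture can be sketched in a few lines of Python. This is an illustrative toy model, not anything from the article (the `measure` function and the variable names are assumptions): a qubit is a pair of complex amplitudes, and measuring it returns 0 or 1 with probabilities given by the squared magnitudes.

```python
import cmath
import random

# Toy qubit: a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measurement yields 0 with probability |alpha|^2.
def measure(alpha: complex, beta: complex) -> int:
    p0 = abs(alpha) ** 2
    return 0 if random.random() < p0 else 1

# An equal superposition: the "spinning coin" before it lands.
alpha = 1 / cmath.sqrt(2)
beta = 1 / cmath.sqrt(2)

# Normalisation holds: the probabilities sum to one.
assert abs(abs(alpha) ** 2 + abs(beta) ** 2 - 1.0) < 1e-9

print(measure(alpha, beta))  # 0 or 1, each with probability 1/2
```

A qubit prepared as (1, 0) always measures 0, and (0, 1) always measures 1, matching the coins-on-the-table picture; only superpositions behave probabilistically.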

Theoreticians can describe what qubits will do in a network of quantum logic gates, even if they don't have any actual machinery capable of carrying out the process. As a result, algorithms can be, and have been, developed for QC machinery even before there was any, rather in the way that Ada Lovelace famously wrote some of the first conventional computer programs for Charles Babbage's proposed 19th-century mechanical computer, the Analytical Engine, even though it was never actually built.
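As a rough sketch of that idea (the matrices and helper below are illustrative assumptions, not from the article), a tiny gate network can be simulated entirely classically: a Hadamard gate followed by a CNOT turns the two-qubit state |00> into the entangled Bell state, with no quantum hardware in sight.

```python
import math

def apply(gate, state):
    """Multiply a gate matrix into a state vector (plain lists, no libraries)."""
    return [sum(gate[r][c] * state[c] for c in range(len(state)))
            for r in range(len(gate))]

h = 1 / math.sqrt(2)
# Hadamard on the first qubit of a 2-qubit register (H tensor I),
# over the basis |00>, |01>, |10>, |11>.
H_I = [[h, 0, h, 0],
       [0, h, 0, h],
       [h, 0, -h, 0],
       [0, h, 0, -h]]
# CNOT: flip the second qubit when the first is 1 (swaps |10> and |11>).
CNOT = [[1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

state = [1, 0, 0, 0]                    # start in |00>
state = apply(CNOT, apply(H_I, state))  # run the gate network
print(state)  # ~[0.707, 0, 0, 0.707]: the Bell state (|00> + |11>)/sqrt(2)
```

This is exactly what the quantum simulators mentioned later in the article do, scaled up: the catch is that the state vector doubles in size with every qubit added, which is why classical simulation runs out of road quickly.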

Thus we know many of the things that QC could achieve. Its effects, when it becomes available at appropriate scale, will be enormous. Quantum computers will find a use anywhere there is a large and complicated problem to be solved. That could be anything from predicting the financial markets, to improving weather forecasts, to cracking encryption systems.

Privacy advocates already fear that quantum computing could one day crack today's secure encryption and the many things built on it. Those with a stake in cryptocurrency may naturally be concerned, according to GlobalData analyst Sam Holt.

"Bitcoin and other cryptos use an elliptic curve signature scheme where public and private encryption keys are used to verify transactions," Holt explains to Verdict. "Older signature tech doesn't hash (fingerprint) the public key and this can therefore be known by anyone. Around 25% of bitcoins are stored using this older tech, and are vulnerable. At the moment, it remains difficult for bad actors to find out the private key. As early as 2027, however, quantum computers could be at the point where they could use the public key to break the encryption."
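The "fingerprint" point can be sketched with stdlib hashing. This is a hedged illustration only: SHA-256 stands in for Bitcoin's full public-key hashing (which actually applies SHA-256 and then RIPEMD-160), and the key bytes are made up rather than derived from a real elliptic-curve keypair.

```python
import hashlib

# Hypothetical 33-byte compressed public key (a real one would come from an
# elliptic-curve keypair; these bytes are invented for illustration).
pubkey = bytes.fromhex("02" + "ab" * 32)

# Hashed outputs publish only this fingerprint on the blockchain; the public
# key itself is revealed only at spend time. (Bitcoin's actual scheme also
# applies RIPEMD-160 after SHA-256.)
fingerprint = hashlib.sha256(pubkey).hexdigest()
print(fingerprint[:16])

# A quantum attacker running Shor's algorithm needs the public key itself to
# recover the private key; a one-way hash gives it nothing to work on, which
# is why the older, unhashed outputs are the vulnerable ones.
```

This is why the ~25% of bitcoins held in older, unhashed outputs are the exposed population: their public keys are already sitting in the open, waiting for hardware that can exploit them.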

It could take only one quantum-crypto-heist for investors to lose confidence.

Before this happens, though, fellow GlobalData analyst Mike Orme forecasts that post-quantum cryptography (PQC) will have been developed using classical computers.

"It won't take quantum computers to develop PQC (so) there doesn't seem to be a case for dumping Bitcoin," Orme believes. "But there is a case for governments and enterprises to think seriously about shifting out of current RSA-encrypted systems."

Quantum computing's capacity for number crunching may make it a lucrative option when it comes to cryptocurrency mining, but it's not yet at a suitable stage. Today's most advanced mining technology is extremely fast compared to the current clock speed of what quantum computers can offer now or in the short term, and it's likely to stay that way for the next decade at least.

For a quantum computer to work in many of the applications which have already been worked out for it, it would need hundreds of thousands, even millions of qubits. The highest we can manage today is around a hundred. The process of a qubit calculation is so sensitive that the apparatus around it has to block out various forms of interference, especially heat. The supply chain for this kind of tech can't yet be called a chain, and expertise is scarce.

But there is nonetheless already a QC market. GlobalData's recent thematic report on quantum computing puts the 2020 QC market size somewhere in the range of $80m-$500m (the exact figure is hard to pin down).

Where is this money coming from? One source is Canadian QC company D-Wave, which has been selling quasi-quantum computers since 2011 for $20m each, notably to US national labs. These computers are based on the quantum annealing method, meaning they are suited to solving optimisation problems, but incapable of handling more advanced algorithms and problems.

Most revenue in quantum computing lies in cloud-based quantum service businesses from IBM, Google, Microsoft, Alibaba, Amazon and others. These Quantum-as-a-Service (QaaS) providers rent time on prototype quantum processors and simulators, often built using conventional compute power, to the rapidly swelling band of researchers and developers from government, major corporates and start-ups navigating through the quantum world.

These developers know there is money to be made on the software and application side, especially when it comes to algorithms. While it will be years until fully-fledged versions of quantum algorithms can be run on full-size quantum computers, there is scope to develop algorithms for intermediate-scale quantum devices in areas such as logistics optimisation. Such algorithms are likely to work in hybrid systems, where some qubits are combined with classical computers, within the next five years. Quantum simulators, meanwhile, which essentially mimic quantum computers but run on classical computers, are becoming increasingly popular as a way of testing quantum computation without the need for an actual quantum computer.

The last few years have seen some road tests of quantum power, literally: a reduction of car waiting times by 20% in a large-scale traffic simulation, for example. This was achieved by Microsoft in partnership with Toyota Tsusho and Jij, a Japanese quantum algorithm start-up. Algorithms based on a realistic QC model were run on classical computers to reduce the waiting time for drivers at red lights, saving about five seconds on average for each car. In 2019, Volkswagen and D-Wave optimised routes in real-time for a fleet of municipal buses running between stops in Lisbon, considering potential traffic jams and passenger numbers. While hardware development in QC may be stuck in a metaphorical traffic jam, it's a different story for QC software.

If you'd like to find out more about real-world quantum computing, you can register for GlobalData's free-to-attend Quantum Computing webinar on 21st April 2021 at 4pm (BST). APAC audiences will find a more suitably scheduled session on the 22nd; sign up free here. These expert-led sessions will explore the risks facing QC investors, and why and when quantum computing will change the game for business.


D-Wave Government Sponsors a Quantum Academy at the Cyber Bytes Foundation to Accelerate US Government’s Adoption of Practical Quantum Computing -…

PALO ALTO, Calif. and STAFFORD, Va., April 15, 2021 (GLOBE NEWSWIRE) -- D-Wave Government Inc., a subsidiary providing D-Wave's quantum computing technology, software, services, and expertise to the U.S. government, and Cyber Bytes Foundation (CBF), a non-profit producing education, innovation, and outreach programs responsive to national security challenges, today announced they will work together to host a quantum academy.

Quantum computing's importance to our national security remains a focus for the federal government. In the FY21 NDAA, the Service Secretaries were tasked with providing an annual list of technical problems and research challenges likely to be addressable by quantum computers and available for use within the next one to three years. The CBF Quantum Academy, sponsored by D-Wave Government Inc., is aimed at providing strategic guidance, training, and education to ensure the government understands how to best harness the powerful and complex technology. Together, the two organizations are hosting four academy events this year. These events will be available to attend in a virtual capacity in the short term, with the potential to attend in person at the Quantico Cyber Hub in Stafford, VA later in the year. The following topics will be covered:

"Expanding quantum computing access and understanding of the capabilities of today's technology within the U.S. government is critical to the technology's continued maturation and role in the public sector. Cyber Bytes Foundation's education and outreach expertise complements D-Wave Government's quantum leadership in this shared endeavor," said Alan Baratz, CEO of D-Wave. "Practical quantum computing is capable of delivering value and tackling complex problems that matter to the U.S. government today, from public safety planning to autonomous vehicle routing. We look forward to working with CBF to help the government better understand how to harness quantum computing."

"Quantum computing represents the next technology revolution. However, there is a knowledge gap surrounding its current capabilities. We are excited to partner with D-Wave Government Inc., a leading quantum provider and subject matter expert, to stand up this academy," said Joel Scharlat, Director of Operations with the Cyber Bytes Foundation. "These classes have been carefully designed to provide government decision-makers at every level with the information necessary to employ quantum computing to solve today's critical challenges. This is the next step in making the Quantico Cyber Hub the center of technology innovation for the government."

To learn more about the quantum academy, how to register, and how D-Wave Government and CBF are working together to expand government access to quantum computing education and technology, click here. To find out more about D-Wave's technology and value to government agencies and national labs, including NASA, the U.S. Air Force, and the U.S. Naval Research Laboratory, click here. To learn more about the Cyber Bytes Foundation, click here.

About D-Wave Government Inc.

D-Wave is the leader in the development and delivery of quantum computing technology, software, and services, and the world's first commercial supplier of quantum computers. D-Wave Government Inc., a U.S. subsidiary, was formed in 2013 to provide D-Wave's quantum computing technology to the U.S. government. D-Wave's quantum technology has been used by some of the world's most advanced organizations, including Lockheed Martin, Google, NASA Ames, Oak Ridge National Laboratory, and Los Alamos National Laboratory. D-Wave has been granted more than 200 US patents and has published over 100 scientific papers, many of which have appeared in leading science journals including Nature, Science and Nature Communications.

About the Cyber Bytes Foundation

The Cyber Bytes Foundation is a 501(c)(3), with the mission to establish and sustain a unique cyber ecosystem to produce education, innovation, and outreach programs responsive to our national security challenges. The Quantico Cyber Hub is the largest Cyber Security Center of Excellence in a Virginia HUBZone and is designed as an agnostic Cyber Domain Ecosystem where people (SMEs), processes and capabilities are brought together to customize solutions to accelerate the implementation of Advanced Cyber Technologies through experimentation, innovation, research and application.

Contact

D-Wave Systems Inc.
dwave@launchsquad.com
