Archive for the ‘Quantum Computer’ Category

First retail IBM quantum computer headed to Cleveland …

The Discovery Accelerator will serve as the technology foundation for Cleveland Clinic's new Global Center for Pathogen Research & Human Health. Over the next decade, researchers will use IBM's cloud, robotics, and quantum computing technology to remotely design and synthesize molecules, analyze the molecular features in viral and bacterial genomes to boost drug discovery, and break down and potentially obtain deeper insights from structured and unstructured data at a faster rate than ever.

"Through this innovative collaboration, we have a unique opportunity to bring the future to life," said Tom Mihaljevic, M.D., CEO and president of Cleveland Clinic. "These new computing technologies can help revolutionize discovery in the life sciences. The Discovery Accelerator will enable our renowned teams to build a forward-looking digital infrastructure and help transform medicine, while training the workforce of the future and potentially growing our economy."

"The COVID-19 pandemic has spawned one of the greatest races in the history of scientific discovery, one that demands unprecedented agility and speed," said Arvind Krishna, chairman and chief executive officer of IBM. "At the same time, science is experiencing a change of its own, with high-performance computing, hybrid cloud, data, AI, and quantum computing being used in new ways to break through long-standing bottlenecks in scientific discovery. Our new collaboration with Cleveland Clinic will combine their world-renowned expertise in healthcare and life sciences with IBM's next-generation technologies to make scientific discovery faster, and the scope of that discovery larger than ever."

"Quantum will make the impossible possible, and when the Governor and I announced the Cleveland Innovation District earlier this year, this was the kind of innovative investment I hoped it would advance," said Ohio Lt. Governor Jon Husted, director of InnovateOhio. "A partnership between these two great institutions will put Cleveland, and Ohio, on the map for advanced medical and scientific research, providing a unique opportunity to improve treatment options for patients and solve some of our greatest healthcare challenges."

Source: Cleveland Clinic, via Engadget

Read more:
First retail IBM quantum computer headed to Cleveland ...

GlobalFoundries and PsiQuantum partner on full-scale quantum computer – VentureBeat


PsiQuantum and Globalfoundries have teamed up to manufacture the chips that will become part of the Q1 quantum computer.

Palo Alto, California-based PsiQuantum has plans to create a million-qubit quantum computer. Globalfoundries is a major chipmaker that will manufacture the silicon photonic and electronic chips that are part of the Q1.

The system they're working on now is the first milestone in PsiQuantum's roadmap to deliver a commercially viable quantum computer with 1 million qubits (the basic unit of quantum information) and beyond. PsiQuantum believes silicon photonics, or combining optics with silicon chips, is the only way to scale beyond 1 million qubits and deliver an error-corrected, fault-tolerant, general-purpose quantum computer. PsiQuantum wants to deliver quantum capabilities that drive advances with customers and partners across climate, health care, finance, energy, agriculture, transportation, and communications.

PsiQuantum and GF have now demonstrated a world-first ability to manufacture core quantum components, such as single-photon sources and single-photon detectors, with precision and in volume, using the standard manufacturing processes of GF's world-leading semiconductor fab. The companies have also installed proprietary production and manufacturing equipment in two of GlobalFoundries' 300-millimeter factories to produce thousands of Q1 silicon photonic chips at its facility in upstate New York and state-of-the-art electronic control chips at its Fab 1 facility in Dresden, Germany.

Above: A GlobalFoundries cleanroom. (Image credit: GlobalFoundries)

PsiQuantum's Q1 system represents breakthroughs in silicon photonics, which the company believes is the only way to scale to a million or more qubits and deliver an error-corrected, fault-tolerant, general-purpose quantum computer.

The Q1 system is the result of five years of development at PsiQuantum by some of the world's foremost experts in photonic quantum computing. The team made it their mission to bring the world-changing benefits of quantum computing to reality, based on two fundamental understandings. "GlobalFoundries is fast becoming a leader in silicon photonics," Moor Insights & Strategy analyst Patrick Moorhead said in an email to VentureBeat. "Its announcement with PsiQuantum now adds quantum computing to its SiPho repertoire of datacenter and chip-level connectivity."

First, it focused on a quantum computer capable of performing otherwise impossible calculations requiring a million physical qubits. Second, it leveraged more than 50 years and trillions of dollars invested in the semiconductor industry as the path to creating a commercially viable quantum computer.

GlobalFoundries' Amir Faintuch said in a statement that "we have experienced a decade of technological change in the past year" and that the digital transformation and explosion of data now "requires quantum computing to accelerate a compute renaissance."

GlobalFoundries' silicon photonics manufacturing platform enables PsiQuantum to develop quantum chips that can be measured and tested for long-term performance reliability. This is critical to the ability to execute quantum algorithms, which require millions or billions of gate operations. PsiQuantum is collaborating with researchers, scientists, and developers at leading companies to explore and test quantum use cases across a range of industries, including energy, health care, finance, agriculture, transportation, and communications.

Pete Shadbolt, chief strategy officer at PsiQuantum, said in a statement that this is "a major achievement for both the quantum and semiconductor industries, demonstrating that it's possible to build the critical components of a quantum computer on a silicon chip, using standard manufacturing processes." He said PsiQuantum knew that scaling the system was key. By the middle of the decade, PsiQuantum and GlobalFoundries hope to create all the manufacturing lines and processes needed to begin assembling a final machine.

PsiQuantum and GlobalFoundries want to play a critical role in ensuring the United States becomes a global leader in quantum computing, supported by a secure, domestic supply chain.

Originally posted here:
GlobalFoundries and PsiQuantum partner on full-scale quantum computer - VentureBeat

Researchers confront major hurdle in quantum computing – University of Rochester

May 4, 2021

Quantum science has the potential to revolutionize modern technology with more efficient computers, communication, and sensing devices. But challenges remain in achieving these technological goals, especially when it comes to effectively transferring information in quantum systems.

A regular computer consists of billions of transistors, which encode information as bits. Quantum computers, on the other hand, are based on quantum bits, also known as qubits, which can be made from a single electron.

Unlike ordinary transistors, which can be either 0 (off) or 1 (on), qubits can be both 0 and 1 at the same time. The ability of individual qubits to occupy these so-called superposition states, where they are in multiple states simultaneously, underlies the great potential of quantum computers. Just like ordinary computers, however, quantum computers need a way to transfer quantum information between distant qubits, and that presents a major experimental challenge.
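The superposition idea above can be made concrete with a short sketch (not from the article; the helper name and amplitude values are illustrative). A qubit's state is a pair of complex amplitudes for the 0 and 1 outcomes, and measurement probabilities come from their squared magnitudes:

```python
import math

def probabilities(state):
    """Measurement probabilities of a single-qubit state (a, b),
    where |a|^2 is the chance of reading 0 and |b|^2 of reading 1."""
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

# A classical bit is always definitely 0 or definitely 1.
bit_zero = (1, 0)

# An equal superposition: the qubit is "both 0 and 1" until measured.
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))

print(probabilities(bit_zero))  # (1, 0): always reads 0
print(probabilities(plus))      # roughly (0.5, 0.5): a 50/50 outcome
```

Any valid state must have its two probabilities sum to 1; the 50/50 `plus` state is what the article means by a qubit being in multiple states simultaneously.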

In a series of papers published in Nature Communications, researchers at the University of Rochester, including John Nichol, an assistant professor of physics and astronomy, and graduate students Yadav Kandel and Haifeng Qiao, the lead authors of the papers, report major strides in enhancing quantum computing by improving the transfer of information between electrons in quantum systems.

In one paper, the researchers demonstrated a method of transferring information between qubits, called adiabatic quantum state transfer (AQT), for the first time with electron-spin qubits. Unlike most methods of transferring information between qubits, which rely on carefully tuned electric or magnetic-field pulses, AQT isn't as affected by pulse errors and noise.

To envision how AQT works, imagine you are driving your car and want to park it. If you don't hit your brakes at the proper time, the car won't be where you want it, with potential negative consequences. In this sense, the control pulses (the gas and brake pedals) to the car must be tuned carefully. AQT is different in that it doesn't really matter how long you press the pedals or how hard you press them: the car will always end up in the right spot. As a result, AQT has the potential to improve the transfer of information between qubits, which is essential for quantum networking and error correction.

The researchers demonstrated AQT's effectiveness by exploiting entanglement, one of the basic concepts of quantum physics in which the properties of one particle affect the properties of another, even when the particles are separated by a large distance. The researchers were able to use AQT to transfer one electron's quantum spin state across a chain of four electrons in semiconductor quantum dots, tiny nanoscale semiconductors with remarkable properties. This is the longest chain over which a spin state has ever been transferred, tying the record set by the researchers in a previous Nature paper.
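What "transferring a spin state down a chain of four sites" means can be illustrated with a toy simulation (this is not the AQT protocol itself, which avoids discrete pulses; here idealized SWAP operations stand in, purely to show a state moving along a chain). The chain of four two-level systems is a 16-amplitude state vector, and a SWAP exchanges two qubits' bits in every basis-state index:

```python
N = 4  # chain of four two-level systems (spins/qubits)

def swap(state, i, j):
    """Apply an ideal SWAP gate between qubits i and j of a state vector."""
    out = [0j] * len(state)
    for idx, amp in enumerate(state):
        bi = (idx >> i) & 1
        bj = (idx >> j) & 1
        # exchange bits i and j of the basis-state index
        new = idx & ~((1 << i) | (1 << j))
        new |= bi << j
        new |= bj << i
        out[new] += amp
    return out

# Qubit 0 holds a superposition a|0> + b|1>; qubits 1-3 start in |0>.
a, b = 0.6, 0.8  # any amplitudes with a^2 + b^2 = 1
state = [0j] * (2 ** N)
state[0b0000] = a  # qubit 0 (least significant bit) in |0>
state[0b0001] = b  # qubit 0 in |1>

# Shuttle the state down the chain with nearest-neighbour exchanges.
for i in range(N - 1):
    state = swap(state, i, i + 1)

# The superposition now lives entirely on qubit 3.
print(abs(state[0b0000]), abs(state[0b1000]))  # 0.6 0.8
```

The point of AQT in the paper is that the real transfer achieves this end result without needing each "pulse" (here, each SWAP) to be timed perfectly.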

"Because AQT is robust against pulse errors and noise, and because of its major potential applications in quantum computing, this demonstration is a key milestone for quantum computing with spin qubits," Nichol says.

In a second paper, the researchers demonstrated another technique of transferring information between qubits, using an exotic state of matter called time crystals. A time crystal is a strange state of matter in which interactions between the particles that make up the crystal can stabilize oscillations of the system in time indefinitely. Imagine a clock that keeps ticking forever; the pendulum of the clock oscillates in time, much like the oscillating time crystal.

By implementing a series of electric-field pulses on electrons, the researchers were able to create a state similar to a time crystal. They found that they could then exploit this state to improve the transfer of an electrons spin state in a chain of semiconductor quantum dots.

"Our work takes the first steps toward showing how strange and exotic states of matter, like time crystals, can potentially be used for quantum information processing applications, such as transferring information between qubits," Nichol says. "We also theoretically show how this scenario can implement other single- and multi-qubit operations that could be used to improve the performance of quantum computers."

Both AQT and time crystals, while different, could be used simultaneously with quantum computing systems to improve performance.

"These two results illustrate the strange and interesting ways that quantum physics allows for information to be sent from one place to another, which is one of the main challenges in constructing viable quantum computers and networks," Nichol says.

Tags: Arts and Sciences, Department of Physics and Astronomy, John Nichol, quantum computing, quantum physics

Category: Science & Technology

More:
Researchers confront major hurdle in quantum computing - University of Rochester

E4 to Participate in Textarossa to Develop Technologies for Energy-Efficient HPC – HPCwire

SCANDIANO, Italy, April 29, 2021 — E4 Computer Engineering announces its participation in TEXTAROSSA (Towards EXtreme scale Technologies and Accelerators for HW/SW Supercomputing Applications for exascale), a project co-funded by the European High Performance Computing (EuroHPC) Joint Undertaking to drive innovation in the efficiency and usability of high-end HPC systems.

TEXTAROSSA aims to achieve a broad impact on the High Performance Computing (HPC) field in both pre-exascale and exascale scenarios. Within the TEXTAROSSA consortium, E4 will develop innovative heterogeneous HPC platforms powered by the latest generation of processors and reconfigurable hardware accelerators, and integrating innovative EU-developed cooling equipment. The members of the consortium will test and deploy on these platforms advanced algorithms, innovative methods, and user-oriented software applications for classic HPC domains as well as for emerging domains in High Performance Artificial Intelligence (HPC-AI) and High Performance Data Analytics (HPDA).

To achieve high performance and high energy efficiency on near-future exascale computing systems, a technology gap needs to be bridged: increasing the efficiency of computation with carefully selected components and equipment coupled with new arithmetics, and providing methods and tools for the seamless integration of reconfigurable accelerators in heterogeneous HPC platforms. TEXTAROSSA aims to tackle this gap by applying a consistent co-design approach to develop heterogeneous HPC solutions, supported by the integration and extension of IPs, programming models, and tools derived from other European research projects to which the TEXTAROSSA partners contribute.

TEXTAROSSA, co-funded by the European High-Performance Computing Joint Undertaking (JU) and by the Italian Ministero dello Sviluppo Economico (MISE), is coordinated by the Agenzia Nazionale per le Nuove Tecnologie, l'Energia e lo Sviluppo Economico Sostenibile (ENEA) and leverages the expertise of 17 partners located in 5 European countries. The project will run for three years.

Within TEXTAROSSA, and applying a co-design approach, E4 is tasked with developing the IDV (Integrated Development Vehicles), mirroring and extending the European Processor Initiative's ARM64-based architecture. The advanced technologies developed by the partners during the project will be tested and validated on the IDV. To drive the technology development and assess the impact of the proposed innovations, from the node level to the system level, TEXTAROSSA will use a selected but representative set of HPC, HPDA, and AI applications and demonstrators covering challenging HPC domains such as general-purpose numerical kernels, High Energy Physics (HEP), Oil & Gas, and climate modelling, as well as emerging domains such as High Performance Data Analytics (HPDA) and High Performance Artificial Intelligence (HPC-AI).

In addition to its technological objectives, TEXTAROSSA aims to foster European competitiveness in the development and deployment of advanced solutions for science and industry and to make available indispensable tools and systems in the competitive and critical field of HPC. TEXTAROSSA will also provide valuable data to the co-design and development teams of the European Processor Initiative (EPI), of which E4 is a member, and constitutes a testbed for maturing an EU-developed, innovative two-phase cooling technology, enabling it to become available in different architectures.

The partners will regularly publish project updates at http://www.textarossa.eu

Cosimo Gianfreda, CTO of E4 Computer Engineering: "Over the years, E4 has pursued a strategy of staying at the leading edge of cost-effective technology. Our products have been designed with the requirements of end users in mind and with the goal of making these systems user-friendly while providing top performance at the lowest TCO. The IDV (Integrated Development Vehicles) developed by E4 in TEXTAROSSA will represent a key testbed for validating innovative technologies and a significant leap in proposing high-tech and energy-efficient solutions."

"TEXTAROSSA is both a technological challenge and a significant opportunity to develop new products to better serve the needs of our customers. The technological challenges have always been addressed by the E4 R&D Lab, where we test new equipment and new components on a daily basis. E4 brings this know-how and expertise to TEXTAROSSA within a consistent co-design approach. The expertise of our team will be applied in the development of the IDV (Integrated Development Vehicles), which will be thoroughly tested with real-life and emerging applications in a data-center-like environment," says Daniele Gregori, Scientific Coordinator of E4 Computer Engineering.

E4 Computer Engineering

E4 Computer Engineering creates and supplies hardware and software solutions for High Performance Computing, Cloud Computing (private and hybrid), containerization, High Performance Data Analytics, Artificial Intelligence, Deep Learning, and virtualization. The growth of recent years has led the company to complete its offering with various open-source technologies such as OpenStack, Kubernetes, and tools for the implementation of a CI/CD toolchain.

http://www.e4company.com

Source: E4 Computer Engineering

View original post here:
E4 to Participate in Textarossa to Develop Technologies for Energy-Efficient HPC - HPCwire

How Merck works with Seeqc to cut through quantum computing hype – VentureBeat


When it comes to grappling with the future of quantum computing, enterprises are scrambling to figure out just how seriously they should take this new computing architecture. Many executives are trapped between the anxiety of missing the next wave of innovation and the fear of being played for suckers by people overhyping quantum's revolutionary potential.

That's why the approach to quantum taken by pharmaceutical giant Merck offers a clear-eyed roadmap for other enterprises to follow. The company is taking a cautious but informed approach that includes setting up an internal working group and partnering with quantum startup Seeqc to monitor developments while keeping an open mind.

According to Philipp Harbach, a theoretical chemist who heads Merck's In Silico Research group, a big part of the challenge remains keeping executives' expectations reasonable even as startup funding for quantum soars and the hype continues to mount.

"We are not evangelists of quantum computers," Harbach said. "But we are also not skeptics. We are just realistic. If you talk to academics, they tell you there is no commercial value. And if you talk to our management, they tell you in three years they want a product out of it. So, there are two worlds colliding that are not very compatible. I think that's typical for every hype cycle."

Merck's desire for the dream of quantum computing to become reality is understandable. The fundamental nature of its business, biology and chemistry, means the company has been building molecular- or quantum-level models for more than a century.

Part of the role of the In Silico Research group is to develop models that can solve quantum problems using evolving technologies such as data analytics and AI, and to apply them to the natural sciences to make experimental work less time-consuming.

But those models are always limited and imperfect because they are being calculated on non-quantum platforms that can't fully mimic the complexity of interactions. If someone can build a fully fault-tolerant quantum computer that operates at sufficient scale and cost, Merck could unlock a new generation of efficiencies and scientific breakthroughs.

"The quantum computer will be another augmentation to a classical computer," Harbach said. "It won't be a replacement, but an augmentation that will tackle some of these problems in a way that we cannot imagine. Hopefully, it will speed them up in a way that the efficacy of the methods we are employing will be boosted."

About three years ago, Merck decided it was time to start educating itself about the emerging quantum sector. The company's venture capital arm, M Ventures, began looking within the company for experts who could help it with due diligence as it began to assess quantum startups. That included mapping out the players and the whole value chain of quantum computing, according to Harbach.

That led to the formal creation of the Quantum Computing Task Force, which has roughly 50 members who try to communicate with quantum players large and small as well as peers among Mercks own competition.

"We are basically an interest group trying to understand this topic," Harbach said. "That's why we have quite a good overview and understanding of timelines, players, possibilities, and applications."

As part of that exploration, M Ventures eventually began investing in quantum-related startups. In April 2020, the venture fund announced a $5 million investment in Seeqc, a New York-based startup that bills itself as "the Digital Quantum Computing company."

"We thought that it might be good to have partners on the hardware part and on the software part," Harbach said. "Seeqc will partner with us within Merck to really work on problems, basically as a hardware partner."

Seeqc is developing a hybrid approach that it believes will make quantum computing useful sooner. The idea is to combine classical computing architectures with quantum computing. It does this through its system-on-a-chip design.

This technology was originally developed at Hypres, a semiconductor electronics developer which spun out Seeqc last year. The M Ventures funding for Seeqc followed a previous $6.8 million seed round. Seeqc raised a subsequent round of $22 million last September in a round led by EQT Ventures.

According to Seeqc CEO John Levy, the company's technology allows it to address some of the fundamental challenges facing quantum systems. Despite rapid advancements in recent years, quantum computers remain too unstable to deliver the high-performance computing needed to justify their costs.

Part of the reason for that is that qubits, the basic units of quantum computing, need to be kept at temperatures near absolute zero to operate. Scaling then becomes costly and difficult because a system operating with thousands of qubits would be immensely complex to manage, in part because of the massive heat-management issue.

Levy said Seeqc can address that problem by placing classical microchips over a qubit array to stabilize the environment at cryogenic temperatures while maintaining speed and reducing latency. The company uses a single-flux-quantum technology it has developed that replaces the microwave pulses used in other quantum systems. As a result, the company says its platform enables quantum computing at about 1/400 of the cost of current systems in development.

"We have taken much of the complexity that you've seen in a quantum computer and we've removed almost all of that by building a set of chips that we've designed," Levy said.

Just as important is the philosophical approach Seeqc is taking. It's not building a general-purpose quantum computer. Instead, it plans to build application-specific ones, tailored to the problems a client is trying to solve. Because Seeqc has its own chip foundry, it can customize its chips to the needs of application developers as they create different algorithms, Levy said.

In that spirit, Merck's Quantum Computing Task Force is working closely with Seeqc to create viable quantum computers that can be used by its various businesses.

"Their technology is a key technology to scale a quantum computer, which is actually much more important because it will make quantum computers bigger and cheaper," Harbach said. "And this is, of course, essential for the whole market."

For all this activity, Harbach's view of quantum's potential remains sober. He sees nothing on the market that will have any commercial impact, certainly not for Merck. At this point, many of the company's questions remain academic.

"What we are basically interested in is how, or whether, quantum computer hardware will ever be scalable to a level where it can tackle problems of realistic size to us," Harbach said. "And the same question goes for the software side. Will there ever be algorithms that can basically mimic these problems on a quantum computer efficiently, so that they don't run into noise problems? We are not interested in simulating a molecule right now on a quantum computer. Everything we try to understand is about the timelines: what will be possible, and when will it be possible."

Harbach has watched the rise in quantum startup funding and various milestone announcements but remains dubious of many of these claims.

"They are creating a new market where there's not even the technology ready for it," Harbach said. "You have to stay realistic. There's a lot of money at the moment from governments and VCs. There's a lot of boost from consultancies because they are trying to sell consultancy. And if you talk to experts, it's the other way around. They tell you not before 15 years."

The questions Merck asks internally fall into two fundamental categories: When will there be a quantum computer that can process its current quantum models more efficiently? And when will there be a quantum computer so powerful that it opens up new problems and new solutions the company cannot even imagine today?

"Quantum will be a thing, definitely," Harbach said. "The only question is when, and I'm really, really sure it won't be in the next two years. I wouldn't even say three years. There will be a quantum winter. Winter is coming."

See the original post:
How Merck works with Seeqc to cut through quantum computing hype - VentureBeat