Archive for the ‘Quantum Computer’ Category

E4 to Participate in Textarossa to Develop Technologies for Energy-Efficient HPC – HPCwire

SCANDIANO, Italy, April 29, 2021 – E4 Computer Engineering announces its participation in TEXTAROSSA (Towards EXtreme scale Technologies and Accelerators for HW/SW Supercomputing Applications for exascale), a project co-funded by the European High Performance Computing (EuroHPC) Joint Undertaking to drive innovation in the efficiency and usability of high-end HPC systems.

TEXTAROSSA aims to achieve a broad impact on the High Performance Computing (HPC) field in both pre-exascale and exascale scenarios. Within the TEXTAROSSA consortium, E4 will develop innovative heterogeneous HPC platforms powered by latest-generation processors and reconfigurable hardware accelerators, integrating innovative EU-developed cooling equipment. The members of the consortium will test and deploy on these platforms advanced algorithms, innovative methods and user-oriented software applications for classic HPC domains as well as for emerging domains in High Performance Artificial Intelligence (HPC-AI) and High Performance Data Analytics (HPDA).

To achieve high performance and high energy efficiency on near-future exascale computing systems, a technology gap needs to be bridged: increasing the efficiency of computation with carefully selected components and equipment coupled with new arithmetics, and providing methods and tools for the seamless integration of reconfigurable accelerators in heterogeneous HPC platforms. TEXTAROSSA aims to tackle this gap by applying a consistent co-design approach to develop heterogeneous HPC solutions, supported by the integration and extension of IPs, programming models and tools derived from other European research projects to which the TEXTAROSSA partners are contributing.

TEXTAROSSA, co-funded by the European High-Performance Computing Joint Undertaking (JU) and by the Italian Ministero dello Sviluppo Economico (MISE), is coordinated by the Agenzia Nazionale per le Nuove Tecnologie, l'Energia e lo Sviluppo Economico Sostenibile (ENEA) and leverages the expertise of 17 partners located in 5 European countries. The project will run for three years.

Within TEXTAROSSA, applying a co-design approach, E4 is tasked to develop the IDV (Integrated Development Vehicles), mirroring and extending the European Processor Initiative's ARM64-based architecture. The advanced technologies developed by the partners during the project will be tested and validated on the IDV. To drive the technology development and assess the impact of the proposed innovations, from node to system level, TEXTAROSSA will use a selected but representative set of HPC, HPDA and AI applications and demonstrators covering challenging HPC domains such as general-purpose numerical kernels, High Energy Physics (HEP), Oil & Gas and climate modelling, as well as emerging domains such as High Performance Data Analytics (HPDA) and High Performance Artificial Intelligence (HPC-AI).

In addition to its technological objectives, TEXTAROSSA aims to foster European competitiveness in the development and deployment of advanced solutions for science and industry, and to make indispensable tools and systems available in the competitive and critical field of HPC. TEXTAROSSA will also provide valuable data to the co-design and development teams of the European Processor Initiative (EPI), of which E4 is a member, and constitutes a testbed for maturing an innovative EU-developed two-phase cooling technology, enabling it to become available across different architectures.

The partners will regularly publish project updates on the website http://www.textarossa.eu.

Cosimo Gianfreda, CTO of E4 Computer Engineering: "Over the years, E4 has pursued a strategy of staying at the leading edge of cost-effective technology. Our products have been designed taking into account the requirements of end users, with the goal of making these systems user-friendly while providing top performance at the lowest TCO. The IDV (Integrated Development Vehicles) developed by E4 in TEXTAROSSA will represent a key testbed for validating innovative technologies and a significant leap in proposing high-tech and energy-efficient solutions."

"TEXTAROSSA is both a technological challenge and a significant opportunity to develop new products to better serve the needs of our customers. The technological challenges have always been addressed by the E4 R&D Lab, where we test new equipment and new components on a daily basis. E4 brings this know-how and expertise to TEXTAROSSA within a consistent co-design approach. The expertise of our team will be applied in the development of the IDV (Integrated Development Vehicles), which will be thoroughly tested with real-life and emerging applications in a data-center-like environment," says Daniele Gregori, Scientific Coordinator of E4 Computer Engineering.

E4 Computer Engineering

E4 Computer Engineering creates and supplies hardware and software solutions for High Performance Computing, Cloud Computing (Private and Hybrid), containerization, High Performance Data Analytics, Artificial Intelligence, Deep Learning and Virtualization. The growth of recent years has led the company to round out its offering with various open-source technologies such as OpenStack, Kubernetes, and tools for implementing a CI/CD toolchain.

http://www.e4company.com

Source: E4 Computer Engineering


How Merck works with Seeqc to cut through quantum computing hype – VentureBeat


When it comes to grappling with the future of quantum computing, enterprises are scrambling to figure out just how seriously they should take this new computing architecture. Many executives are trapped between the anxiety of missing the next wave of innovation and the fear of being played for suckers by people overhyping quantum's revolutionary potential.

That's why the approach to quantum by pharmaceutical giant Merck offers a clear-eyed roadmap for other enterprises to follow. The company is taking a cautious but informed approach that includes setting up an internal working group and partnering with quantum startup Seeqc to monitor developments while keeping an open mind.

According to Philipp Harbach, a theoretical chemist who heads Merck's In Silico Research group, a big part of the challenge remains keeping executives' expectations reasonable even as startup funding for quantum soars and the hype continues to mount.

"We are not evangelists of quantum computers," Harbach said. "But we are also not skeptics. We are just realistic. If you talk to academics, they tell you there is no commercial value. And if you talk to our management, they tell you in 3 years they want a product out of it. So, there are two worlds colliding that are not very compatible. I think that's typical for every hype cycle."

Merck's desire for the dream of quantum computing to become reality is understandable. The fundamental nature of its business (biology and chemistry) means the company has been building molecular- or quantum-level models for more than a century.

Part of the role of the In Silico Research group is to develop models that can solve quantum problems using evolving technologies such as data analytics and AI, and to apply them to the natural sciences to make experimental work less time-consuming.

But those models are always limited and imperfect because they are being calculated on non-quantum platforms that can't fully mimic the complexity of interactions. If someone can build a fully fault-tolerant quantum computer that operates at sufficient scale and cost, Merck could unlock a new generation of efficiencies and scientific breakthroughs.

"The quantum computer will be another augmentation to a classical computer," Harbach said. "It won't be a replacement, but an augmentation which will tackle some of these problems in a way that we cannot imagine. Hopefully, it will speed them up in a way that the efficacy of the methods we are employing will be boosted."

About three years ago, Merck decided it was time to start educating itself about the emerging quantum sector. The company's venture capital arm, M Ventures, began looking within the company for experts who could help it with due diligence as it began to assess quantum startups. That included mapping out the players and the whole value chain of quantum computing, according to Harbach.

That led to the formal creation of the Quantum Computing Task Force, which has roughly 50 members who try to communicate with quantum players large and small, as well as with peers among Merck's own competition.

"We are basically an interest group trying to understand this topic," Harbach said. "That's why we have a quite good overview and understanding of timelines, players, possibilities, and applications."

As part of that exploration, M Ventures eventually began investing in quantum-related startups. In April 2020, the venture fund announced a $5 million investment in Seeqc, a New York-based startup that bills itself as "the Digital Quantum Computing company."

"We thought that it might be good to have partners in the hardware part and in the software part," Harbach said. "Seeqc will partner with us within Merck to really work on problems, basically as a hardware partner."

Seeqc is developing a hybrid approach that it believes will make quantum computing useful sooner. The idea is to combine classical computing architectures with quantum computing. It does this through its system-on-a-chip design.

This technology was originally developed at Hypres, a semiconductor electronics developer that spun out Seeqc last year. The M Ventures funding for Seeqc followed an earlier $6.8 million seed round, and last September Seeqc raised a further $22 million in a round led by EQT Ventures.

According to Seeqc CEO John Levy, the company's technology allows it to address some of the fundamental challenges facing quantum systems. Despite rapid advancements in recent years, quantum computers remain too unstable to deliver the high-performance computing needed to justify their costs.

Part of the reason is that qubits, the units of quantum computing power, need to be kept at temperatures near absolute zero to operate. Scaling then becomes costly and difficult because a system operating with thousands of qubits would be immensely complex to manage, in part because of the massive heating issue.

Levy said Seeqc can address that problem by placing classic microchips over a qubit array to stabilize the environment at cryogenic temperatures while maintaining speed and reducing latency. The company uses a single-flux quantum technology that it has developed and that replaces the microwave pulses being used in other quantum systems. As a result, the company says its platform enables quantum computing at about 1/400 of the cost of current systems in development.

"We have taken much of the complexity that you've seen in a quantum computer and we've removed almost all of that by building a set of chips that we've designed," Levy said.

Just as important is a philosophical approach Seeqc is taking. It's not building a general-purpose quantum computer. Instead, it plans to build application-specific ones that are tailored specifically to the problems a client is trying to solve. Because Seeqc has its own chip foundry, it can customize its chips to the needs of application developers as they create different algorithms, Levy said.

In that spirit, Mercks Quantum Computing Task Force is working closely with Seeqc to create viable quantum computers that can be used by its various businesses.

"Their technology is a key technology to scale a quantum computer, which is actually much more important because it will make quantum computers bigger and cheaper," Harbach said. "And this is, of course, essential for the whole market."

For all this activity, Harbach's view of quantum's potential remains sober. He sees nothing on the market that will have any commercial impact, certainly not for Merck. At this point, many of the company's questions remain academic.

"What we are basically interested in is how, or whether, the quantum computer hardware will ever be scalable to a level where it can tackle problems of realistic size to us," Harbach said. "And the same question also goes for the software side. Will there ever be algorithms that can basically mimic these problems on a quantum computer efficiently, so that they don't run into noise problems? We are not interested in simulating a molecule right now on a quantum computer. Everything we try to understand is about the timelines: what will be possible, and when will it be possible."

Harbach has watched the rise in quantum startup funding and various milestone announcements but remains dubious of many of these claims.

"They are creating a new market where there's not even the technology ready for it," Harbach said. "You have to stay realistic. There's a lot of money at the moment from governments and VCs. There's a lot of boost from consultancies because they try to sell the consultancy. And if you talk to experts, it's the other way around. They tell you not before 15 years."

The questions Merck asks internally are split into two fundamental categories: When will there be a quantum computer that can be more efficient at processing its current quantum models? And when will there be a quantum computer that is so powerful that it opens up new problems and new solutions that the company cannot even imagine today?

"Quantum will be a thing, definitely," Harbach said. "The only question is when, and I'm really, really sure it won't be in the next two years. I wouldn't even say three years. There will be a quantum winter. Winter is coming."


GCHQ boss is right to be keeping his eye on quantum computing – Verdict

GCHQ Director Jeremy Fleming said on Friday 23 April that the UK needs to prioritize advances in quantum computing if the country wants to prosper and remain secure.

He's right. The vast amount of data protected by RSA encryption is under threat of theft and forgery should quantum computing live up to its promise.

While such peril remains years away at least, companies and governments worldwide are getting to grips with quantum computing as the technology moves out of the realm of physics laboratories and into the inboxes of presidents and prime ministers.

Classical computers, such as those in our phones, laptops, and even the world's most powerful supercomputers, conduct computations with ones and zeros: binary digits, or bits.

When presented with sufficiently complex problems, classical computers begin to struggle.

Consider this number:

25195908475657893494027183240048398571429282126204032027777137836043662020707595556264018525880784406918290641249515082189298559149176184502808489120072844992687392807287776735971418347270261896375014971824691165077613379859095700097330459748808428401797429100642458691817195118746121515172654632282216869987549182422433637259085141865462043576798423387184774447920739934236584823824281198163815010674810451660377306056201619676256133844143603833904414952634432190114657544454178424020924616515723350778707749817125772467962926386356373289912154831438167899885040445364023527381951378636564391212010397122822120720357

If we were to ask a classical, general-purpose computer "which two prime numbers multiplied together make this 617-digit number?", it would have to essentially guess at each possible combination. Using this method, most estimates suggest it would take around 300 trillion years to crack, far longer than the age of the universe. There are ways to speed this up, but this form of encryption is extremely difficult to crack classically.

This difficulty is vital for protecting important data: it is the kind of problem that underpins RSA encryption, which is used to protect vast amounts of data on the internet.
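To make the scale of that brute-force search concrete, here is a minimal sketch (in Python, not taken from the article) of factoring by trial division, the guess-every-divisor approach described above. It cracks a toy semiprime instantly, but the work grows with the size of the smaller prime factor, which is why a 617-digit (2048-bit) RSA modulus is far out of reach for classical machines with this method.

```python
# Minimal illustration, not from the article: brute-force factoring by trial
# division. Fast for toy numbers, hopeless for RSA-sized moduli.

def trial_division_factor(n: int) -> tuple[int, int]:
    """Return (p, q) with p * q == n, found by checking divisors in order."""
    if n % 2 == 0:
        return 2, n // 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2
    return n, 1  # n itself is prime

# A toy semiprime (4999 * 5003) factors instantly...
print(trial_division_factor(25_009_997))
# ...but the loop runs until it reaches the smaller prime factor, so a
# 2048-bit modulus like the 617-digit number above would need on the order
# of 10**308 iterations with this approach.
```

Real attacks use far cleverer algorithms than trial division, but even those remain hopelessly slow for numbers of this size on classical hardware.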

A quantum computer, on the other hand, could figure out the answer in seconds.

While researchers agree that you would need around a few thousand qubits to conduct such a calculation (we're only around the 100-qubit mark right now), it is not beyond the realms of possibility for such a feat to be achieved this decade.

With vast use cases, ranging from artificial intelligence (AI) to weather forecasting, quantum computing's potential encryption-cracking capabilities should put the technology firmly on the priority list for world leaders and security chiefs.

In the Vincent Briscoe Lecture, Fleming made frequent mention of quantum computing.

He highlighted that a small percentage of technologies must be truly sovereign to retain the UK's strategic technical advantage, and quantum computing is no doubt a core part of this. His mention of the elements of cryptographic technology that form part of these technologies was no doubt an allusion to quantum computing. The country, or corporation, that possesses the first full-scale, fault-tolerant quantum computer will be the biggest threat to cryptography the world has ever seen.

Fleming will undoubtedly be aware of China's quantum supremacy announcement in December 2020, in which a team at the University of Science and Technology of China performed a calculation on a photonic quantum computer 100 trillion times faster than classical supercomputers could manage.

While photonic devices are so far unprogrammable, in that each can only perform one specific calculation, the progress in China is a wake-up call for Western powers to get to grips with the technology.

The UK is among the leaders in the West, in both spending and academic prowess, but China's $15bn of investment into quantum technologies dwarfs the rest of the pack. President Biden will no doubt be keeping a close eye on developments in this nascent industry.

Quantum computing is no doubt going to develop significantly as a theme over the coming years, as recent developments indicate. Governments and corporations alike must now take steps to engage, or risk falling behind.

Integer factorization is just one application of quantum computing, in what is becoming a rich ecosystem of research and development. GlobalData's quantum computing value chain sets out the segments of this growing industry.



IonQ Announces Full Integration of its Quantum Computing Platform with Qiskit – CIO Applications

IonQ is the only company that provides access to its quantum computing platform via both the Amazon Braket and Microsoft Azure clouds, as well as through direct API access.

FREMONT, CA: IonQ announced full integration of its quantum computing platform with Qiskit, an open-source quantum software development kit, or SDK. Qiskit users can now submit programs directly to IonQ's platform without writing any new code. Through the Qiskit Partner Program, this new integration makes IonQ's high-connectivity, high-fidelity 11-qubit system available to the 275,000+ enterprise, government, startup, partner, and university members already using Qiskit to create and run quantum programs.

As part of the announcement, IonQ has released an open-source provider library that integrates seamlessly with Qiskit, which can be found on the Qiskit Partners GitHub organization or downloaded via the Python Package Index. Qiskit users with an IonQ account will be able to run their quantum programs on IonQ's cloud quantum computing platform with little to no modification: simply change the code to point to the IonQ backend and run as usual.
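As a rough sketch of what "point the code to the IonQ backend" might look like in practice, the snippet below uses the open-source qiskit-ionq provider library mentioned above; the token placeholder, backend name and shot count are illustrative assumptions rather than details taken from the announcement.

```python
# Hedged sketch of typical qiskit-ionq provider usage; the API token,
# backend name ("ionq_simulator" vs. "ionq_qpu") and shot count are
# illustrative, not taken from the announcement.
from qiskit import QuantumCircuit
from qiskit_ionq import IonQProvider

provider = IonQProvider("YOUR_IONQ_API_TOKEN")    # placeholder credential
backend = provider.get_backend("ionq_simulator")  # swap for "ionq_qpu" to target hardware

# An ordinary Qiskit circuit: prepare a Bell state and measure both qubits.
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

job = backend.run(qc, shots=1024)
print(job.result().get_counts())
```

The point of the integration is that nothing else about the program changes: the same circuit runs on any other Qiskit backend by swapping the provider and backend lines.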

"IonQ is excited to make our quantum computers and APIs easily accessible to the Qiskit community," said IonQ CEO & President Peter Chapman. "Open source has already revolutionized traditional software development. With this integration, we're bringing the world one step closer to the first generation of widely-applicable quantum applications."

This integration builds on IonQ's ongoing success. IonQ recently entered into a merger agreement with dMY Technology Group, Inc. III to go public at an expected valuation of approximately $2 billion. IonQ also recently released a product roadmap setting out its plans to develop modular quantum computers small enough to be networked together in 2023, which could pave the way for broad quantum advantage by 2025. Last year, the company unveiled a new $5.5 million, 23,000 square foot Quantum Data Center in Maryland's Discovery District and announced the development of the world's most powerful quantum computer, featuring 32 perfect atomic qubits with low gate errors and an expected quantum volume greater than 4,000,000.


Cleveland Clinic and IBM hope their tech partnership could help prevent the next pandemic – Action News Now

After a year in which scientists raced to understand Covid-19 and to develop treatments and vaccines to stop its spread, Cleveland Clinic is partnering with IBM to use next-generation technologies to advance healthcare research and potentially prevent the next public health crisis.

The two organizations on Tuesday announced the creation of the "Discovery Accelerator," which will apply technologies such as quantum computing and artificial intelligence to pressing life sciences research questions. As part of the partnership, Cleveland Clinic will become the first private-sector institution to buy and operate an on-site IBM quantum computer, called the Q System One. Currently, such machines only exist in IBM labs and data centers.

Quantum computing is expected to expedite the rate of discovery and help tackle problems with which existing computers struggle.

The accelerator is part of Cleveland Clinic's new Global Center for Pathogen Research & Human Health, a facility introduced in January on the heels of a $500 million investment by the clinic, the state of Ohio and economic development nonprofit JobsOhio to spur innovation in the Cleveland area.

The new center is dedicated to researching and developing treatments for viruses and other disease-causing organisms. That will include some research on Covid-19, including why it causes ongoing symptoms (also called "long Covid") for some who have been infected.

"Covid-19 is an example" of how the center and its new technologies will be used, said Dr. Lara Jehi, chief research information officer at the Cleveland Clinic.

"But ... what we want is to prevent the next Covid-19," Jehi told CNN Business. "Or if it happens, to be ready for it so that we don't have to, as a country, put everything on hold and put all of our resources into just treating this emergency. We want to be proactive and not reactive."

Quantum computers process information in a fundamentally different way from regular computers, so they will be able to solve problems that today's computers can't. They can, for example, test multiple solutions to a problem at once, making it possible to come up with an answer in a fraction of the time it would take a different machine.

Applied to healthcare research, that capability is expected to be useful for modeling molecules and how they interact, which could accelerate the development of new pharmaceuticals. Quantum computers could also improve genetic sequencing to help with cancer research, and design more efficient, effective clinical trials for new drugs, Jehi said.
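As a simplified picture of the "test multiple solutions at once" idea described above, the sketch below (written with Qiskit, and not taken from the article) puts three qubits into an equal superposition of all eight bit strings; a subsequent computation then acts on every input amplitude at once, although extracting a useful answer still depends on interference, which is why only certain problems see a speedup.

```python
# Simplified illustration (not from the article): Hadamard gates place n
# qubits into an equal superposition of all 2**n basis states.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

n = 3
qc = QuantumCircuit(n)
qc.h(range(n))  # superpose all 2**n = 8 bit strings

state = Statevector.from_instruction(qc)
print(state.probabilities_dict())  # each of the 8 outcomes has probability 1/8
```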

Ultimately, Cleveland Clinic and IBM expect that applying quantum and other advanced technologies to healthcare research will speed up the rate of discovery and product development. Currently, the average time from scientific discovery in a lab to getting a drug to a patient is around 17 years, according to the National Institutes of Health.

"We really need to accelerate," Jehi said. "What we learned with the Covid-19 pandemic is that we cannot afford, as a human race, to just drop everything and focus on one emergency at a time."

Part of the problem: It takes a long time to process and analyze the massive amount of data generated by healthcare, research and trials, something that AI, quantum computing and high-performance computing (a more powerful version of traditional computing) can help with. Quantum computers do that by "simulating the world," said Dario Gil, director of IBM Research.

"Instead of conducting physical experiments, you're conducting them virtually, and because you're doing them virtually through computers, it's much faster," Gil said.

For IBM, the partnership represents an important proof point for commercial applications of quantum computing. IBM currently offers access to quantum computers via the cloud to 134 institutions, including Goldman Sachs and Daimler, but building a dedicated machine on-site for one organization is a big step forward.

"What we're seeing is the emergency of quantum as a new industry within the world of information technology and computing," Gil said. "What we're seeing here in the context of Cleveland Clinic is ... a partner that says, 'I want the entire capacity of a full quantum computer to be [dedicated] to my research mission."

The partnership also includes a training element that will help educate people on how to use quantum computing for research, which is likely to further grow the ecosystem around the new technology.

Cleveland Clinic and IBM declined to detail the cost of the quantum system being installed on the clinic's campus, but representatives from both organizations called it a "significant investment." Quantum computers are complex machines to build and maintain because they must be stored at extremely cold temperatures (think: 200 times colder than outer space).

The Cleveland Clinic will start by using IBM's quantum computing cloud offering while waiting for its on-premises machine to be built, which is expected to take about a year. IBM plans to install a more advanced version of its quantum computer at the clinic once it is developed in the coming years.

Jehi, the Cleveland Clinic research lead, acknowledged that quantum computing technology is still nascent, but said the organization wanted to get in on the ground floor.

"It naturally needs nurturing and growing so that we can figure out what are its applications in healthcare," Jehi said. "It was important to us that we design those applications and we learn them ourselves, rather than waiting for others to develop them."
