Archive for the ‘Quantum Computing’ Category

Swedish university is behind quantum computing breakthrough – ComputerWeekly.com

Sweden's Chalmers University of Technology has achieved a quantum computing efficiency breakthrough with a novel type of thermometer that simplifies and speeds up the measurement of temperatures during quantum calculations.

The discovery adds a more advanced benchmarking tool that will accelerate Chalmers' work in quantum computing development.

The novel thermometer is the latest innovation to emerge from the university's research to develop an advanced quantum computer. The so-called OpenSuperQ project at Chalmers is coordinated with technology research organisation the Wallenberg Centre for Quantum Technology (WACQT), which is the OpenSuperQ project's main technology partner.

WACQT has set the goal of building a quantum computer capable of performing precise calculations by 2030. The plan for reaching this ambitious target is based on superconducting circuits and the development of a quantum computer with at least 100 well-functioning qubits. To realise this ambition, the OpenSuperQ project will require a processor working temperature close to absolute zero, ideally as low as 10 millikelvin (-273.14°C).

Headquartered at Chalmers University's research hub in Gothenburg, the OpenSuperQ project, launched in 2018, is intended to run until 2027. Working alongside the university in Gothenburg, WACQT is also operating support projects being run at the Royal Institute of Technology (Kungliga Tekniska Högskolan) in Stockholm and collaborating universities in Lund, Stockholm, Linköping and Gothenburg.

Pledged capital funding for the WACQT-managed OpenSuperQ project, committed by the Knut and Alice Wallenberg Foundation together with 20 other private corporations in Sweden, currently amounts to SEK1.3bn (€128m). In March, the foundation scaled up its funding commitment to WACQT, doubling its annual budget to SEK80m over the next four years.

The increased funding from the foundation will allow WACQT to expand its quantum computing research team, and the organisation is looking to recruit a further 40 researchers for the OpenSuperQ project in 2021-2022. A new team is to be established to study nanophotonic devices, which can enable the interconnection of several smaller quantum processors into a large quantum computer.

The Wallenberg sphere incorporates 16 public and private foundations operated by various family members. Each year, these foundations allocate about SEK2.5bn to research projects in the fields of technology, natural sciences and medicine in Sweden.

"The OpenSuperQ project aims to take Sweden to the forefront of quantum technologies, including computing, sensing, communications and simulation," said Peter Wallenberg, chairman of the Knut and Alice Wallenberg Foundation.

"Quantum technology has enormous potential, so it is vital that Sweden has the necessary expertise in this area. WACQT has built up a qualified research environment and established collaborations with Swedish industry. It has succeeded in developing qubits with proven problem-solving ability. We can move ahead with great confidence in what WACQT will go on to achieve."

"The novel thermometer breakthrough opens the door to experiments in the dynamic field of quantum thermodynamics," said Simone Gasparinetti, assistant professor at Chalmers' quantum technology laboratory.

"Our thermometer is a superconducting circuit directly connected to the end of the waveguide being measured," said Gasparinetti. "It is relatively simple and probably the world's fastest and most sensitive thermometer for this particular purpose at the millikelvin scale."

Coaxial cables and waveguides, the structures that guide waveforms and serve as the critical connection to the quantum processor, remain key components in quantum computers. The microwave pulses that travel down the waveguides to the quantum processor are cooled to extremely low temperatures along the way.

For researchers, a fundamental goal is to ensure that these waveguides are not carrying noise due to the thermal motion of electrons on top of the pulses that they send. Precise temperature measurement readings of the electromagnetic fields are needed at the cold end of the microwave waveguides, the point where the controlling pulses are delivered to the computers qubits.

Working at the lowest possible temperature minimises the risk of introducing errors in the qubits. Until now, researchers have only been able to measure this temperature indirectly, and with relatively long delays. Chalmers University's novel thermometer enables very low temperatures to be measured directly at the receiving end of the waveguide, with high accuracy and extremely high time resolution.

"The novel thermometer developed at the university provides researchers with a value-added tool to measure the efficiency of systems while identifying possible shortcomings," said Per Delsing, a professor at the department of microtechnology and nanoscience at Chalmers and director of WACQT.

"A certain temperature corresponds to a given number of thermal photons, and that number decreases exponentially with temperature," he said. "If we succeed in lowering the temperature at the end where the waveguide meets the qubit to 10 millikelvin, the risk of errors in our qubits is reduced drastically."
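For context, the relationship Delsing describes is the standard Bose-Einstein occupation of a waveguide mode at frequency ω and temperature T (this formula is textbook physics, not quoted from the article):

```latex
\bar{n}_{\mathrm{th}}(\omega, T) = \frac{1}{e^{\hbar\omega / k_B T} - 1}
\;\approx\; e^{-\hbar\omega / k_B T} \qquad \text{for } k_B T \ll \hbar\omega .
```

For a representative 6 GHz control line (an assumed, typical superconducting-qubit frequency, not a figure from the article) cooled to 10 millikelvin, the exponent is roughly 29, giving a mean thermal photon number of order 10^-13, which illustrates why the error risk drops so sharply at that temperature.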

The university's primary role in the OpenSuperQ project is to lead the work on developing the application algorithms that will be executed on the OpenSuperQ quantum computer. It will also support the development of algorithms for quantum chemistry, optimisation and machine learning.

Also, Chalmers will head up efforts to improve quantum coherence in chips with multiple coupled qubits, including device design, process development, fabrication, packaging and testing. It will also conduct research to evaluate the performance of 2-qubit gates and develop advanced qubit control methods to mitigate systematic and incoherent errors to achieve targeted gate fidelities.

The rest is here:
Swedish university is behind quantum computing breakthrough - ComputerWeekly.com

A Tectonic Shift in Analytics and Computing Is Coming – Eos

More than 50 years ago, a fundamental scientific revolution occurred, sparked by the concurrent emergence of a huge amount of new data on seafloor bathymetry and profound intellectual insights from researchers rethinking conventional wisdom. Data and insight combined to produce the paradigm of plate tectonics. Similarly, in the coming decade, a new revolution in data analytics may rapidly overhaul how we derive knowledge from data in the geosciences. Two interrelated elements will be central in this process: artificial intelligence (AI, including machine learning methods as a subset) and high-performance computing (HPC).

Already today, geoscientists must understand modern tools of data analytics and the hardware on which they work. Now AI and HPC, along with cloud computing and interactive programming languages, are becoming essential tools for geoscientists. Here we discuss the current state of AI and HPC in Earth science and anticipate future trends that will shape applications of these developing technologies in the field. We also propose that it is time to rethink graduate and professional education to account for and capitalize on these quickly emerging tools.

Great strides in AI capabilities, including speech and facial recognition, have been made over the past decade, but the origins of these capabilities date back much further. In 1971, the U.S. Defense Advanced Research Projects Agency substantially funded a project called Speech Understanding Research, and it was generally believed at the time that artificial speech recognition was just around the corner. We know now that this was not the case, as today's speech and writing recognition capabilities emerged only as a result of both vastly increased computing power and conceptual breakthroughs such as the use of multilayered neural networks, which mimic the biological structure of the brain.

Recently, AI has gained the ability to create images of artificial faces that humans cannot distinguish from real ones by using generative adversarial networks (GANs). These networks combine two neural networks, one that produces a model and a second one that tries to discriminate the generated model from the real one. Scientists have now started to use GANs to generate artificial geoscientific data sets.
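To make the generator/discriminator idea concrete, here is a minimal, hedged sketch of one GAN training step in PyTorch. It is not taken from the article or the cited studies; the network sizes, latent dimension, and the 1-D "trace" data standing in for geoscientific signals are illustrative assumptions.

```python
# Minimal GAN sketch: a generator learns to produce synthetic 1-D "traces"
# (stand-ins for geoscientific signals) while a discriminator learns to
# tell them apart from real samples. Sizes and data are illustrative only.
import torch
import torch.nn as nn

LATENT, TRACE_LEN = 64, 256

generator = nn.Sequential(
    nn.Linear(LATENT, 128), nn.ReLU(),
    nn.Linear(128, TRACE_LEN), nn.Tanh(),          # outputs a fake trace
)
discriminator = nn.Sequential(
    nn.Linear(TRACE_LEN, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1), nn.Sigmoid(),               # P(input is real)
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCELoss()

def train_step(real_batch):
    """One adversarial update; real_batch has shape (batch, TRACE_LEN)."""
    batch = real_batch.size(0)
    ones, zeros = torch.ones(batch, 1), torch.zeros(batch, 1)

    # 1) Discriminator: push real samples toward 1, generated samples toward 0
    fake = generator(torch.randn(batch, LATENT)).detach()
    loss_d = bce(discriminator(real_batch), ones) + bce(discriminator(fake), zeros)
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # 2) Generator: try to make the discriminator output 1 on fakes
    fake = generator(torch.randn(batch, LATENT))
    loss_g = bce(discriminator(fake), ones)
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()

# Usage: call train_step(batch) over batches of real traces, then sample
# synthetic data with generator(torch.randn(n, LATENT)).
```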

These and other advances are striking, yet AI and many other artificial computing tools are still in their infancy. We cannot predict what AI will be able to do 20-30 years from now, but a survey of existing AI applications recently showed that computing power is the key when targeting practical applications today. The fact that AI is still in its early stages has important implications for HPC in the geosciences. Currently, geoscientific HPC studies have been dominated by large-scale time-dependent numerical simulations that use physical observations to generate models [Morra et al., 2021a]. In the future, however, we may work in the other direction: Earth, ocean, and atmospheric simulations may feed large AI systems that in turn produce artificial data sets that allow geoscientific investigations, such as Destination Earth, for which collected data are insufficient.

Development of AI capabilities is well underway in certain geoscience disciplines. For a decade now [Ma et al., 2019], remote sensing operations have been using convolutional neural networks (CNNs), a kind of neural network that adaptively learns which features to look at in a data set. In seismology (Figure 1), pattern recognition is the most common application of machine learning (ML), and recently, CNNs have been trained to find patterns in seismic data [Kong et al., 2019], leading to discoveries such as previously unrecognized seismic events [Bergen et al., 2019].

New AI applications and technologies are also emerging; these involve, for example, the self-ordering of seismic waveforms to detect structural anomalies in the deep mantle [Kim et al., 2020]. Recently, deep generative models, which are based on neural networks, have shown impressive capabilities in modeling complex natural signals, with the most promising applications in autoencoders and GANs (e.g., for generating images from data).

CNNs are a form of supervised machine learning (SML), meaning that before they are applied for their intended use, they are first trained to find prespecified patterns in labeled data sets and to check their accuracy against an answer key. Training a neural network using SML requires large, well-labeled data sets as well as massive computing power. Massive computing power, in turn, requires massive amounts of electricity, such that the energy demand of modern AI models is doubling every 3.4 months and causing a large and growing carbon footprint.
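As a concrete, hedged illustration of the SML workflow just described, the sketch below trains a small 1-D CNN on labeled seismic windows and checks its accuracy against the labels. The window length, class count, and data tensors are assumptions for illustration, not the setup of the cited studies.

```python
# Supervised training sketch: a small 1-D CNN learns to label fixed-length
# seismic windows (e.g., "noise" vs "event"). Shapes and labels are
# illustrative; a real study would use curated, well-labeled catalogs.
import torch
import torch.nn as nn

WINDOW, N_CLASSES = 400, 2   # 400-sample windows, two labels (assumed)

model = nn.Sequential(
    nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
    nn.Conv1d(16, 32, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
    nn.Flatten(),
    nn.Linear(32 * (WINDOW // 16), N_CLASSES),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train_epoch(waveforms, labels, batch_size=64):
    """waveforms: (N, 1, WINDOW) float tensor; labels: (N,) int tensor."""
    model.train()
    for start in range(0, len(labels), batch_size):
        x = waveforms[start:start + batch_size]
        y = labels[start:start + batch_size]
        loss = loss_fn(model(x), y)   # compare predictions to the "answer key"
        optimizer.zero_grad(); loss.backward(); optimizer.step()

def accuracy(waveforms, labels):
    model.eval()
    with torch.no_grad():
        return (model(waveforms).argmax(dim=1) == labels).float().mean().item()
```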

In the future, the trend in geoscientific applications of AI might shift from using bigger CNNs to using more scalable algorithms that can improve performance with less training data and fewer computing resources. Alternative strategies will likely involve less energy-intensive neural networks, such as spiking neural networks, which reduce data inputs by analyzing discrete events rather than continuous data streams.

Unsupervised ML (UML), in which an algorithm identifies patterns on its own rather than searching for a user-specified pattern, is another alternative to data-hungry SML. One type of UML identifies unique features in a data set to allow users to discover anomalies of interest (e.g., evidence of hidden geothermal resources in seismic data) and to distinguish trends of interest (e.g., rapidly versus slowly declining production from oil and gas wells based on production rate transients) [Vesselinov et al., 2019].
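A minimal sketch of the unsupervised idea follows, using k-means clustering from scikit-learn to flag data windows whose features lie far from every cluster centre. The feature array, cluster count, and threshold are illustrative assumptions, not the workflow of the cited study.

```python
# Unsupervised sketch: cluster feature vectors extracted from data windows
# and flag windows that sit unusually far from every cluster centre as
# potential anomalies. Features and threshold are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

def find_anomalies(features, n_clusters=5, quantile=0.99):
    """features: (n_windows, n_features) array, e.g. per-window spectral
    statistics. Returns indices of candidate anomalies."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(features)
    # Distance from each window to its nearest cluster centre
    dist = np.min(km.transform(features), axis=1)
    threshold = np.quantile(dist, quantile)
    return np.where(dist > threshold)[0]

# Usage with synthetic stand-in data:
rng = np.random.default_rng(0)
features = rng.normal(size=(10_000, 8))
candidates = find_anomalies(features)
print(f"{len(candidates)} windows flagged for closer inspection")
```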

AI is also starting to improve the efficiency of geophysical sensors. Data storage limitations require instruments such as seismic stations, acoustic sensors, infrared cameras, and remote sensors to record and save data sets that are much smaller than the total amount of data they measure. Some sensors use AI to detect when interesting data are recorded, and these data are selectively stored. Sensor-based AI algorithms also help minimize the energy consumption of, and prolong the life of, sensors located in remote regions, which are difficult to service and often powered by a single solar panel. These techniques include quantized CNNs (using 8-bit variables) running on minimal hardware, such as the Raspberry Pi [Wilkes et al., 2017].

Powerful, efficient algorithms and software represent only one part of the data revolution; the hardware and networks that we use to process and store data have evolved significantly as well.

Since about 2004, when the increase in frequencies at which processors operate stalled at about 3 gigahertz (the end of Moore's law), computing power has been augmented by increasing the number of cores per CPU and by the parallel work of cores in multiple CPUs, as in computing clusters.

Accelerators such as graphics processing units (GPUs), once used mostly for video games, are now routinely used for AI applications and are at the heart of all major ML facilities (as well as the U.S. Exascale Strategy, a part of the National Strategic Computing Initiative). For example, Summit and Sierra, the two fastest supercomputers in the United States, are based on a hierarchical CPU-GPU architecture. Meanwhile, emerging tensor processing units, which were developed specifically for matrix-based operations, excel at the most demanding tasks of most neural network algorithms. In the future, computers will likely become increasingly heterogeneous, with a single system combining several types of processors, including specialized ML coprocessors (e.g., Cerebras) and quantum computing processors.

Computational systems that are physically distributed across remote locations and used on demand, usually called cloud computing, are also becoming more common, although these systems impose limitations on the code that can be run on them. For example, cloud infrastructures, in contrast to centralized HPC clusters and supercomputers, are not designed for performing large-scale parallel simulations. Cloud infrastructures face limitations on high-throughput interconnectivity, and the synchronization needed to help multiple computing nodes coordinate tasks is substantially more difficult to achieve for physically remote clusters. Although several cloud-based computing providers are now investing in high-throughput interconnectivity, the problem of synchronization will likely remain for the foreseeable future.

Artificial intelligence has proven invaluable in discovering and analyzing patterns in large, real-world data sets. It could also become a source of realistic artificial data sets, generated through models and simulations. Artificial data sets enable geophysicists to examine problems that are unwieldy or intractable using real-world data (because these data may be too costly or technically demanding to obtain) and to explore what-if scenarios or interconnected physical phenomena in isolation. For example, simulations could generate artificial data to help study seismic wave propagation; large-scale geodynamics; or flows of water, oil, and carbon dioxide through rock formations to assist in energy extraction and storage.

HPC and cloud computing will help produce and run 3D models, not only assisting in improved visualization of natural processes but also allowing for investigation of processes that can't be adequately studied with 2D modeling. In geodynamics, for example, using 2D modeling makes it difficult to calculate 3D phenomena like toroidal flow and vorticity because flow patterns are radically different in 3D. Meanwhile, phenomena like crustal porosity waves (waves of high porosity in rocks; Figure 2) and corridors of fast-moving ice in glaciers require extremely high spatial and temporal resolutions in 3D to capture [Räss et al., 2020].

Fig. 2. A 3D modeling run with 16 billion degrees of freedom simulates flow focusing in porous media and identifies a pulsed behavior phenomenon called porosity waves. Credit: Räss et al. [2018], CC BY 4.0

Adding an additional dimension to a model can require a significant increase in the amount of data processed. For example, in exploration seismology, going from a 2D to a 3D simulation involves a transition from three-dimensional data (i.e., source, receiver, time) to five-dimensional data (source x, source y, receiver x, receiver y, and time) [e.g., Witte et al., 2020]. AI can help with this transition. At the global scale, for example, the assimilation of 3D simulations in iterative full-waveform inversions for seismic imaging was performed recently with limited real-world data sets, employing AI techniques to maximize the amount of information extracted from seismic traces while maintaining the high quality of the data [Lei et al., 2020].
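A hedged back-of-the-envelope illustration of that jump in data volume (the shot and receiver counts and the sample length below are invented for illustration, not survey figures from the article):

```python
# How adding a dimension inflates exploration-seismology data volumes.
# Counts below are invented for illustration (float32 samples assumed).
n_src, n_rec, n_t = 500, 500, 2000   # sources, receivers, time samples

shape_2d = (n_src, n_rec, n_t)                 # 2D line: (source, receiver, time)
shape_3d = (n_src, n_src, n_rec, n_rec, n_t)   # 3D survey: (src_x, src_y, rec_x, rec_y, time)

bytes_2d = n_src * n_rec * n_t * 4
bytes_3d = n_src * n_src * n_rec * n_rec * n_t * 4
print(f"2D survey: {bytes_2d / 1e9:.1f} GB")   # ~2 GB
print(f"3D survey: {bytes_3d / 1e15:.1f} PB")  # ~0.5 PB for the same nominal counts
```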

As far as we've come in developing AI for uses in geoscientific research, there is plenty of room for growth in the algorithms and computing infrastructure already mentioned, as well as in other developing technologies. For example, interactive programming, in which the programmer develops new code while a program is active, and language-agnostic programming environments that can run code in a variety of languages are young techniques that will facilitate introducing computing to geoscientists.

Programming languages such as Python and Julia, which are now being taught to Earth science students, will accompany the transition to these new methods and will be used in interactive environments such as the Jupyter Notebook. Recent implementations of Julia have been shown to perform well as compiled code for machine learning algorithms, for example those using differentiable programming, which reduces computational resource and energy requirements.

Quantum computing, which encodes and processes information in quantum states (qubits) rather than in classical bits carried by streams of electrons, is another promising development that is still in its infancy but that may lead to the next major scientific revolution. It is forecast that by the end of this decade, quantum computers will be applied to many scientific problems, including those related to wave propagation, crustal stresses, atmospheric simulations, and other topics in the geosciences. With competition from China in developing quantum technologies and AI, quantum computing and quantum information applications may become darlings of major funding opportunities, offering the means for ambitious geophysicists to pursue fundamental research.
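As a one-line illustration of the contrast with classical bits (standard textbook material, not drawn from the article), a single qubit occupies a superposition of its two basis states,

```latex
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,
```

and a register of n qubits spans a state space of dimension 2^n, which is the property that forecast applications such as wave propagation and atmospheric simulation hope to exploit.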

Taking advantage of these new capabilities will, of course, require geoscientists who know how to use them. Today, many geoscientists face enormous pressure to requalify themselves for a rapidly changing job market and to keep pace with the growing complexity of computational technologies. Academia, meanwhile, faces the demanding task of designing innovative training to help students and others adapt to market conditions, although finding professionals who can teach these courses is challenging because they are in high demand in the private sector. However, such teaching opportunities could provide a point of entry for young scientists specializing in computer science or part-time positions for professionals retired from industry or national labs [Morra et al., 2021b].

The coming decade will see a rapid revolution in data analytics that will significantly affect the processing and flow of information in the geosciences. Artificial intelligence and high-performance computing are the two central elements shaping this new landscape. Students and professionals in the geosciences will need new forms of education enabling them to rapidly learn the modern tools of data analytics and predictive modeling. If done well, the concurrence of these new tools and a workforce primed to capitalize on them could lead to new paradigm-shifting insights that, much as the plate tectonic revolution did, help us address major geoscientific questions in the future.

The listed authors thank Peter Gerstoft, Scripps Institution of Oceanography, University of California, San Diego; Henry M. Tufo, University of Colorado Boulder; and David A. Yuen, Columbia University and Ocean University of China, Qingdao, who contributed equally to the writing of this article.

Continued here:
A Tectonic Shift in Analytics and Computing Is Coming - Eos

Toshiba and 10 other Japanese firms to form quantum tech tie-up – The Japan Times

Eleven Japanese companies, including Toshiba Corp., plan to jointly launch a council this summer in a bid to create a new industry using quantum technology, the companies said Monday.

The council will identify and discuss issues linked to quantum computers, quantum cryptography and other base technologies, as well as issues concerning related human resources and rules, with the aim of exploring the possibility of industrializing quantum technology, which is widely expected to play a key role in national security.

Quantum computers have much higher computing capabilities than conventional computers, while quantum cryptography theoretically makes wiretapping impossible.

With the council, the 11 companies, which also include Toyota Motor Corp. and Nippon Telegraph and Telephone Corp., hope to boost their presence at a time when U.S. and Chinese players are vying with each other in the research and development of quantum technology.

"This is an all-Japan system aimed at making the country a world innovation leader in quantum technology," Toshiba President Satoshi Tsunakawa said at the council's foundation meeting on Monday.

Also on Monday, online flea market operator Mercari Inc. said it has set up an organization for the research and development of quantum internet, jointly with Keio University and the University of Tokyo.

Quantum internet, which involves the exchange of quantum data, allows users to have safe communication, protecting them from prying eyes.

Mercari plans to encourage other companies to join the organization. Shota Nagayama, a senior researcher at Mercari, said: "We want more firms to join our efforts to accumulate knowledge."

The new organization aims to put quantum internet into practical use on a trial basis in 15 years.

These moves highlight that industry-government-academia cooperation is gaining momentum in Japan over the research and development of quantum technology, which is applicable to many sorts of businesses.


View post:
Toshiba and 10 other Japanese firms to form quantum tech tie-up - The Japan Times

Global IT giant to partner with U of C on quantum computing centre – Calgary Herald

A global IT giant has announced plans to partner with the University of Calgary to create a centre of excellence for quantum computing in the city.

Bangalore-based Mphasis Ltd., a provider of IT outsourcing services, announced Wednesday that it will set up a Canadian headquarters in Calgary. The move is expected to create 500 to 1,000 local jobs within the next two to three years, according to company CEO Nitin Rakesh.

The company will also establish what it dubs the Quantum City Centre of Excellence at the University of Calgary to serve as a hub for companies focused on the commercial development of quantum technologies. Mphasis will be the anchor tenant and will work to draw in other companies working in the field.

Quantum computing uses the principles of quantum physics to solve problems. It is considered to be a huge leap forward from traditional computer technology, and has futuristic applications in the fields of medicine, energy, fintech, logistics and more.

In a virtual news conference Wednesday, Premier Jason Kenney called quantum computing one of the most promising emerging high-tech sectors. He said the partnership between Mphasis and the University of Calgary will help make Alberta a destination of choice for investment capital and talent in this growing field.

"The goal is to make Alberta a force to be reckoned with in quantum computing, machine learning and AI, economically but also intellectually," Kenney said. "Post-secondary students will have incredible opportunities to master the most sought-after skills through this venture."

Mphasis also announced its plans to establish Sparkle Calgary, which will offer training in artificial intelligence and automation technology for Albertans seeking a career transition. Rakesh said through this platform, Mphasis hopes to help address the skills shortage that currently plagues Alberta's tech sector, while at the same time helping out-of-work Albertans find a place in the new economy.

"There's a ton of data expertise that sits at the heart of the oil and gas industry," Rakesh said. "So can we take that ability to apply data knowledge, data science, and really re-skill (those workers) toward cloud computing . . . That's the vision we want to see."

The University of Calgary has been working for some time to help establish Alberta as a leader in quantum computing research through its Institute for Quantum Science and Technology, a multidisciplinary group of researchers from the areas of computer science, mathematics, chemistry and physics. The U of C is also a member of Quantum Alberta, which aims to accelerate quantum science research, development and commercialization in the province.

U of C president Ed McCauley said Wednesday he hopes that the partnership with Mphasis will lead to the birth of a new wave of startup companies in Calgary, ones that will use cutting-edge technology developed on campus.

"This (quantum) technology will not only create its own industry, but it will fuel advances in others," McCauley said. "Calgary will not only be an energy capital, it will be a quantum capital, too."

The federal government has identified quantum computing as critically important to the future economy. The most recent federal budget includes $360 million for a National Quantum Strategy encompassing funding for research, students and skills development.

Mphasis is the second major Indian IT company in recent months to announce it will set up shop in Calgary. In March, Infosys, a New York Stock Exchange-listed global consulting and IT services firm with more than 249,000 employees worldwide, said it will bring 500 jobs to the city over the next three years as part of the next phase of its Canadian expansion.

Like Mphasis, Infosys has formed partnerships with Calgary's post-secondary institutions to invest jointly in training programs that will help to develop a local technology talent pool.

astephenson@postmedia.com


See the original post:
Global IT giant to partner with U of C on quantum computing centre - Calgary Herald

Jülich, University of Würzburg Investigating Innovations for Quantum Computing with Topological Insulators – HPCwire

JÜLICH and WÜRZBURG, Germany, June 1, 2021 – Forschungszentrum Jülich and the University of Würzburg will together investigate the quantum phenomena of topological materials and the opportunities they present within quantum computing. The Free State of Bavaria is funding the project to the tune of €13 million.

Numerous research groups worldwide are working on the development of quantum computers. Such computers will offer numerous advantages when they are ready for application. They require very little energy and provide extremely fast computing power as well as a high level of data security.

However, a number of technical challenges still need to be overcome. To achieve further progress in this regard, Forschungszentrum Jülich and the University of Würzburg (JMU) are strengthening their long-standing cooperation in this field.

The project partners are turning to topological insulators as a material class. Together, they aim to research and develop topological material systems that would serve as suitable components for quantum computers.

Jülich and JMU: A strong partnership

Wolfgang Marquardt, Chairman of the Board of Directors of Forschungszentrum Jülich, and then JMU President Alfred Forchel signed a cooperation agreement to that effect in March 2021.

"The cooperation with Jülich provides JMU with a great opportunity," Forchel explains. "We already have outstanding resources in Würzburg in the fields of solid-state physics, semiconductor physics, and topological materials. In Forschungszentrum Jülich, we have a strong partner whose expertise complements our own very nicely. Together, we can lead the way in topological quantum computing."

Wolfgang Marquardt, Chairman of the Board of Directors of Forschungszentrum Jülich, adds: "The development of highly complex technologies such as those required for quantum computing can only be successfully achieved through sharing expertise and through the cooperation of strong partners. This cooperation is an important foundation to bring together the complementary expertise of JMU and Forschungszentrum Jülich as part of a joint effort to explore the possibilities of topological materials for robust quantum computers and thus to create a hub for new, solid-state quantum innovations."

Funding from Bavaria

The Bavarian Ministry of Economic Affairs, Regional Development and Energy is providing roughly €13 million in funding to the project to investigate quantum computing on the basis of topological materials through experimental and theoretical approaches. Bavaria's minister president Markus Söder had announced this investment at the end of 2019 as part of the state's Hightech Agenda Bayern initiative.

Four research groups involved

Funding is to be provided to four research groups; it will be used to establish four young investigator groups across the two research locations.

From JMU, the teams of professors Laurens Molenkamp (experimental physics) and Björn Trauzettel (theoretical physics) are taking part in the cooperation. Both teams aim to host young researchers from Jülich who will set up their own young investigator groups in Würzburg. The idea behind this is as follows: "The young people will act as a kind of human bridge bringing expertise from Jülich to Würzburg and vice versa," explains Trauzettel.

At Jülich, the subinstitutes of the Peter Grünberg Institute (PGI) specializing in the fields of solid-state physics and theoretical physics are participating, led by professors Detlev Grützmacher (PGI-9), Stefan Tautz (PGI-3), Stefan Blügel (PGI-1), and David DiVincenzo (PGI-2). "Through the continuation of the Virtual Institute for Topological Insulators, which is funded by the Helmholtz Association, synergies in research into topological insulators will now be used in closer scientific collaboration to establish a pathway towards quantum computing," says Grützmacher, explaining the high hopes being placed in this project.

Long-standing cooperation in an excellent environment

Various collaborations in the fields of physics and information technology materials have been in place between Forschungszentrum Jülich and JMU for over ten years now. In 2012, the Virtual Institute for Topological Insulators (VITI) was jointly founded by the two partners. In light of the promising developments in topological quantum computing, both parties have decided to strengthen this cooperation in the form of joint working groups.

The research collaboration operates in an outstanding environment with two clusters of excellence related to the field: Complexity and Topology in Quantum Matter (CT.QMAT) (Würzburg-Dresden) and Matter and Light for Quantum Computing (ML4Q) (Cologne-Aachen-Bonn-Jülich).

A Helmholtz Quantum Center is also being built at Jülich. At JMU, a new building is under construction for the Institute for Topological Insulators (ITI). The first research teams are scheduled to move into the new building as of mid-2021.

Source: Forschungszentrum Jülich

See the original post here:
Jülich, University of Würzburg Investigating Innovations for Quantum Computing with Topological Insulators - HPCwire