Archive for the ‘Quantum Computing’ Category

IBM partners with U.K. on $300M quantum computing research initiative – VentureBeat


The U.K. government and IBM this week announced a five-year £210 million ($297.5 million) artificial intelligence (AI) and quantum computing collaboration, in the hopes of making new discoveries and developing sustainable technologies in fields ranging from life sciences to manufacturing.

The program will hire 60 scientists, as well as bring in interns and students, to work under the auspices of IBM Research and the U.K.'s Science and Technology Facilities Council (STFC) at the Hartree Centre in Daresbury, Cheshire. The newly formed Hartree National Centre for Digital Innovation (HNCDI) will apply AI, high-performance computing (HPC) and data analytics, quantum computing, and cloud technologies to advance research in areas like materials development and environmental sustainability, IBM said in a statement.

"Artificial intelligence and quantum computing have the potential to revolutionize everything from the way we travel to the way we shop. They are exactly the kind of fields I want the U.K. to be leading in," U.K. Science Minister Amanda Solloway said.

The Hartree Centre was opened in 2012 by UK Research and Innovation's STFC as an HPC, data analytics, and AI research facility. It's housed within Sci-Tech Daresbury's laboratory for research in accelerator science, biomedicine, physics, chemistry, materials, engineering, computational science, and more.

The program is part of IBM's Discovery Accelerator initiative to "accelerate discovery and innovation based on a convergence of advanced technologies" at research centers like HNCDI, the company said. This will be IBM's first Discovery Accelerator research center in Europe.

As part of the HNCDI program, the STFC Hartree Centre is joining over 150 global organizations, ranging from Fortune 500 companies to startups, with an IBM Hybrid Cloud-accessible connection to the IBM Quantum Network. The Quantum Network is the computing giant's assembly of premium quantum computers and development tools. IBM will also provide access to its commercial and experimental AI products and tools for work in areas like material design, scaling and automation, supply chain logistics, and trusted AI applications, the company said.

IBM has been busy inking Discovery Accelerator deals with partners this year. The company last month made a $200 million investment in a 10-year joint project with the Grainger College of Engineering at the University of Illinois Urbana-Champaign (UIUC). As with the HNCDI in the U.K., the planned IBM-Illinois Discovery Accelerator Institute at UIUC will build out new research facilities and hire faculty and technicians.

Earlier this year, IBM announced a 10-year quantum computing collaboration with the Cleveland Clinic to build the computational foundation of the future Cleveland Clinic Global Center for Pathogen Research & Human Health. That project will see the installation of the first U.S.-based on-premises, private sector IBM Quantum System One, the company said. In the coming years, IBM also plans to install one of its first next-generation 1,000+ qubit quantum systems at a Cleveland Clinic site.

The pandemic added urgency to the task of harnessing quantum computing, AI, and other cutting-edge technologies to help solve medicine's most pressing problems, IBM chair and CEO Arvind Krishna said in March at the time of the Cleveland Clinic announcement.

"The COVID-19 pandemic has spawned one of the greatest races in the history of scientific discovery, one that demands unprecedented agility and speed," Krishna said in a statement.

"At the same time, science is experiencing a change of its own, with high-performance computing, hybrid cloud, data, AI, and quantum computing being used in new ways to break through long-standing bottlenecks in scientific discovery. Our new collaboration with Cleveland Clinic will combine their world-renowned expertise in health care and life sciences with IBM's next-generation technologies to make scientific discovery faster and the scope of that discovery larger than ever," Krishna said.

Go here to read the rest:
IBM partners with U.K. on $300M quantum computing research initiative - VentureBeat

Looking to the future of quantum cloud computing – Siliconrepublic.com

Trinity College Dublin's Dan Kilper and the University of Arizona's Saikat Guha discuss the quantum cloud and how it could be achieved.

Quantum computing has been receiving a lot of attention in recent years as several web-scale providers race towards so-called quantum advantage: the point at which a quantum computer is able to exceed the computing abilities of classical computers.

Large public sector investments worldwide have fuelled research activity within the academic community. The first claim of quantum advantage emerged in 2019, when Google, NASA and Oak Ridge National Laboratory (ORNL) demonstrated a computation that the quantum computer completed in 200 seconds and that the ORNL supercomputer verified up to the point of quantum advantage; completing the full computation classically was estimated to require 10,000 years.

Roadmaps that take quantum computers even further into this regime are advancing steadily. IBM has made quantum computers available for online access for many years now, and recently Amazon and Microsoft started cloud services giving users access to several different quantum computing platforms. So, what comes next?

The step beyond access to a single quantum computer is access to a network of quantum computers. We are starting to see this emerge from the web or cloud-based quantum computers offered by cloud providers: effectively quantum computing as a service, sometimes referred to as cloud-based quantum computing.

This consists of quantum computers connected by classical networks and exchanging classical information in the form of bits, or digital ones and zeros. When quantum computers are connected in this way, they each can perform separate quantum computations and return the classical results that the user is looking for.

It turns out that with quantum computers, there are other possibilities. Quantum computers perform operations on quantum bits, or qubits. It is possible for two quantum computers to exchange information in the form of qubits instead of classical bits. We refer to networks that transport qubits as quantum networks. If we can connect two or more quantum computers over a quantum network, then they will be able to combine their computations such that they might behave as a single larger quantum computer.
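
A rough way to see why this matters: an n-qubit register is described by 2^n amplitudes, and joining two registers multiplies, rather than adds, those state spaces. Below is a minimal numpy sketch of the bookkeeping (illustrative only; a real quantum network would distribute entanglement rather than copy states):

```python
import numpy as np

# Two separate 2-qubit machines: each register lives in a 2**2 = 4-dimensional
# state space (toy product states, not an entangling protocol).
psi_a = np.zeros(4); psi_a[0] = 1.0   # machine A in state |00>
psi_b = np.zeros(4); psi_b[0] = 1.0   # machine B in state |00>

# Linked only classically, the machines stay separate: 4 + 4 amplitudes.
# Linked quantumly, they can act as one 4-qubit register whose joint state
# lives in the tensor-product space: 2**4 = 16 amplitudes (4 * 4).
psi_joint = np.kron(psi_a, psi_b)
print(len(psi_a) + len(psi_b), "vs", len(psi_joint))   # 8 vs 16
```

With n qubits on each side, that is 2^n + 2^n amplitudes separately versus 2^(2n) combined, which is why networking quantum machines scales their joint power so dramatically.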

Quantum computing distributed over quantum networks thus has the potential to significantly enhance the computing power of quantum computers. In fact, if we had quantum networks today, many believe that we could immediately build large quantum computers far into the advantage regime simply by connecting many instances of today's quantum computers over a quantum network. With quantum networks built, and interconnected at various scales, we could build a quantum internet. And at the heart of this quantum internet, one would expect to find quantum computing clouds.

At present, scientists and engineers are still working on understanding how to construct such a quantum computing cloud. The key to quantum computing power is the number of qubits in the computer. These are typically micro-circuits or ions kept at cryogenic temperatures, near minus 273 degrees Celsius.

While these machines have been growing steadily in size, it is expected that they will eventually reach a practical size limit, and therefore further computing power is likely to come from network connections across quantum computers within the data centre, very much like today's classical computing data centres. Instead of racks of servers, one would expect rows of cryostats.

Once we start imagining a quantum internet, we quickly realise that there are many software structures that we use in the classical internet that might need some type of analogue in the quantum internet.

Starting with the computers, we will need quantum operating systems and computing languages. This is complicated by the fact that quantum computers are still limited in size and not engineered to run operating systems and programming the way that we do in classical computers. Nevertheless, based on our understanding of how a quantum computer works, researchers have developed operating systems and programming languages that might be used once a quantum computer of sufficient power and functionality is able to run them.
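
For a concrete sense of what such a language looks like today, here is a small program in Qiskit, one existing Python-based quantum framework (shown as an example of the genre, not as the specific systems the authors have in mind):

```python
from qiskit import QuantumCircuit

# A two-qubit Bell-pair program in a present-day high-level framework.
qc = QuantumCircuit(2, 2)
qc.h(0)                     # put qubit 0 into superposition
qc.cx(0, 1)                 # entangle qubit 0 with qubit 1
qc.measure([0, 1], [0, 1])  # read both qubits out into classical bits
print(qc.draw())
```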

Cloud computing and networking rely on other software technologies such as hypervisors, which manage how a computer is divided up into several virtual machines, and routing protocols to send data over the network. In fact, research is underway to develop each of these for the quantum internet. With quantum computer operating systems still under development, it is difficult to develop a hypervisor to run multiple operating systems on the same quantum computer as a classical hypervisor would.

By understanding the physical architecture of quantum computers, however, one can start to imagine how it might be organised to support different subsets of qubits to effectively run as separate quantum computers, potentially using different physical qubit technologies and employing different sub-architectures, within a single machine.

One important difference between quantum and classical computers and networks is that quantum computers can make use of classical computers to perform many of their functions. In fact, a quantum computer in itself is a tremendous feat of classical system engineering with many complex controls to set up and operate the quantum computations. This is a very different starting point from classical computers.

The same can be said for quantum networks, which have the classical internet to provide control functions to manage the network operations. It is likely that we will rely on classical computers and networks to operate their quantum analogues for some time. Just as a computer motherboard has many other types of electronics other than the microprocessor chip, it is likely that quantum computers will continue to rely on classical processors to do much of the mundane work behind their operation.

With the advent of the quantum internet, a control plane equipped with quantum signalling might be able to support certain quantum network functions even more efficiently.

When talking about quantum computers and networks, scientists often refer to fault-tolerant operations. Fault tolerance is a particularly important step toward realising quantum cloud computing. Without fault tolerance, quantum operations are essentially single-shot computations that are initialised and then run to a stopping point set by the accumulation of errors, both from expiring quantum memory lifetimes and from the noise that enters the system with each step in the computation.

Fault tolerance would allow for quantum operations to continue indefinitely with each result of a computation feeding the next. This is essential, for example, to run a computer operating system.
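
A back-of-envelope model shows why single-shot computations hit a wall. Assuming, purely for illustration, a fixed independent error probability per gate, fidelity decays geometrically with circuit depth:

```python
# Toy model: every gate fails independently with probability p; without
# error correction, circuit fidelity decays geometrically with depth.
p = 0.001                       # assumed per-gate error rate (illustrative)
for depth in (100, 1_000, 10_000):
    print(f"depth {depth:>6}: fidelity ~ {(1 - p) ** depth:.1e}")
# ~9.0e-01 at depth 100 but ~4.5e-05 at 10,000: the computation must stop
# long before anything like an operating system could keep running.
```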

In the case of networks, loss and noise limit the distance that qubits can be transported: on the order of 100km today. Fault tolerance through operations such as quantum error correction would allow for quantum networks to extend around the world. This is quite difficult for quantum networks because, unlike classical networks, quantum signals cannot be amplified.

We use amplifiers everywhere in classical networks to boost signals that are reduced due to losses, for example, from travelling down an optical fibre. If we boost a qubit signal with an optical amplifier, we would destroy its quantum properties. Instead, we need to build quantum repeaters to overcome signal losses and noise.
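
The 100km figure quoted above follows directly from fibre attenuation. A quick calculation, assuming the standard telecom loss of roughly 0.2 dB/km at 1,550 nm:

```python
# Photon survival probability after d km of standard telecom fibre.
alpha_db_per_km = 0.2           # assumed loss figure for 1,550 nm fibre
for km in (10, 100, 500, 1000):
    survival = 10 ** (-alpha_db_per_km * km / 10)
    print(f"{km:>5} km: {survival:.1e}")
# ~1e-2 at 100 km but ~1e-20 at 1,000 km; and because amplifying a qubit
# destroys its quantum state, repeaters are the only way to go farther.
```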

If we can connect two fault-tolerant quantum computers at a distance that is less than the loss limits for the qubits, then the quantum error correction capabilities in the computers can in principle recover the quantum signal. If we build a chain of such quantum computers each passing quantum information to the next, then we can achieve the fault-tolerant quantum network that we need. This chain of computers linking together is reminiscent of the early classical internet when computers were used to route packets through the network. Today we use packet routers instead.

If you look under the hood of a packet router, it is composed of many powerful microprocessors that have replaced the computer routers and are much more efficient at the specific routing tasks involved. Thus, one might imagine a quantum analogue to the packet router, which would be a small purpose-built quantum computer designed for recovering and transmitting qubits through the network. These are what we refer to today as quantum repeaters, and with these quantum repeaters we could build a global quantum internet.
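
A deliberately crude model hints at why a repeater chain helps. If each heralded segment can be retried independently, with quantum memories holding earlier successes while neighbouring segments retry, the expected number of attempts grows with the number of segments rather than exponentially with total distance. This sketch ignores entanglement-swapping losses and memory decoherence, so treat the numbers as qualitative:

```python
def survival(km, alpha_db_per_km=0.2):
    """Photon survival probability over km of standard telecom fibre."""
    return 10 ** (-alpha_db_per_km * km / 10)

total_km = 500
print(f"direct link: ~{1 / survival(total_km):.0e} attempts")
for n_seg in (2, 5, 10):
    seg_km = total_km / n_seg
    print(f"{n_seg:>2} segments: ~{n_seg / survival(seg_km):.0e} attempts")
# Five 100 km segments need ~5e+02 attempts versus ~1e+10 for one
# direct 500 km shot.
```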

Currently there is much work underway to realise a fault-tolerant quantum repeater. Recently, a team in the NSF Center for Quantum Networks (CQN) achieved an important milestone: they were able to use a quantum memory to transmit a qubit beyond its usual loss limit. This is a building block for a quantum repeater. The SFI Connect Centre in Ireland is also working on classical network control systems that can be used to operate a network of such repeaters.

Together we have our sights set on realising the networks that will make up the quantum internet.

By Dan Kilper and Saikat Guha

Dan Kilper is professor of future communication networks at Trinity College Dublin and director of the Science Foundation Ireland (SFI) Connect research centre.

Saikat Guha is director of the NSF-ERC Center for Quantum Networks and professor of optical sciences, electrical and computer engineering, and applied mathematics at the University of Arizona.

See the original post here:
Looking to the future of quantum cloud computing - Siliconrepublic.com

Swedish university is behind quantum computing breakthrough – ComputerWeekly.com

Sweden's Chalmers University of Technology has achieved a quantum computing efficiency breakthrough with a novel type of thermometer capable of simply and rapidly measuring temperatures during quantum calculations.

The discovery adds a more advanced benchmarking tool that will accelerate Chalmers' work in quantum computing development.

The novel thermometer is the latest innovation to emerge from the university's research to develop an advanced quantum computer. The so-called OpenSuperQ project at Chalmers is coordinated with technology research organisation the Wallenberg Centre for Quantum Technology (WACQT), which is the OpenSuperQ project's main technology partner.

WACQT has set the goal of building a quantum computer capable of performing precise calculations by 2030. The technical requirements behind this ambitious target are based on superconducting circuits and developing a quantum computer with at least 100 well-functioning qubits. To realise this ambition, the OpenSuperQ project will require a processor working temperature close to absolute zero, ideally as low as 10 millikelvin (-273.14°C).

Headquartered at Chalmers University's research hub in Gothenburg, the OpenSuperQ project, launched in 2018, is intended to run until 2027. Working alongside the university in Gothenburg, WACQT is also operating support projects being run at the Royal Institute of Technology (Kungliga Tekniska Högskolan) in Stockholm and collaborating universities in Lund, Stockholm, Linköping and Gothenburg.

Pledged capital funding for the WACQT-managed OpenSuperQ project, which has been committed by the Knut and Alice Wallenberg Foundation together with 20 other private corporations in Sweden, currently amounts to SEK1.3bn (€128m). In March, the foundation scaled up its funding commitment to WACQT, doubling its annual budget to SEK80m over the next four years.

The increased funding by the foundation will lead to the expansion of WACQT's quantum computing research team, and the organisation is looking to recruit a further 40 researchers for the OpenSuperQ project in 2021-2022. A new team is to be established to study nanophotonic devices, which can enable the interconnection of several smaller quantum processors into a large quantum computer.

The Wallenberg sphere incorporates 16 public and private foundations operated by various family members. Each year, these foundations allocate about SEK2.5bn to research projects in the fields of technology, natural sciences and medicine in Sweden.

"The OpenSuperQ project aims to take Sweden to the forefront of quantum technologies, including computing, sensing, communications and simulation," said Peter Wallenberg, chairman of the Knut and Alice Wallenberg Foundation.

"Quantum technology has enormous potential, so it is vital that Sweden has the necessary expertise in this area. WACQT has built up a qualified research environment and established collaborations with Swedish industry. It has succeeded in developing qubits with proven problem-solving ability. We can move ahead with great confidence in what WACQT will go on to achieve."

The novel thermometer breakthrough opens the door to experiments in the dynamic field of quantum thermodynamics, said Simone Gasparinetti, assistant professor at Chalmers' quantum technology laboratory.

"Our thermometer is a superconducting circuit and directly connected to the end of the waveguide being measured," said Gasparinetti. "It is relatively simple and probably the world's fastest and most sensitive thermometer for this particular purpose at the millikelvin scale."

Coaxial cables and waveguides, the structures that guide waveforms and serve as the critical connection to the quantum processor, remain key components in quantum computers. The microwave pulses that travel down the waveguides to the quantum processor are cooled to extremely low temperatures along the way.

For researchers, a fundamental goal is to ensure that these waveguides are not carrying noise, due to the thermal motion of electrons, on top of the pulses that they send. Precise temperature readings of the electromagnetic fields are needed at the cold end of the microwave waveguides, the point where the controlling pulses are delivered to the computer's qubits.

Working at the lowest possible temperature minimises the risk of introducing errors in the qubits. Until now, researchers have only been able to measure this temperature indirectly, and with relatively long delays. Chalmers University's novel thermometer enables very low temperatures to be measured directly at the receiving end of the waveguide, with high accuracy and with extremely high time resolution.

The novel thermometer developed at the university provides researchers with a value-added tool to measure the efficiency of systems while identifying possible shortcomings, said Per Delsing, a professor at the department of microtechnology and nanoscience at Chalmers and director of WACQT.

"A certain temperature corresponds to a given number of thermal photons, and that number decreases exponentially with temperature," he said. "If we succeed in lowering the temperature at the end where the waveguide meets the qubit to 10 millikelvin, the risk of errors in our qubits is reduced drastically."
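
Delsing's remark can be made quantitative with the Bose-Einstein occupation of a microwave mode, n = 1/(e^(hf/kT) - 1). A short calculation for an assumed 5 GHz line (a typical superconducting-qubit frequency scale; the article does not specify one):

```python
import math

h, k = 6.626e-34, 1.381e-23    # Planck and Boltzmann constants (SI units)
f = 5e9                        # assumed 5 GHz microwave line (illustrative)

def thermal_photons(T_kelvin):
    """Mean thermal photon number of a mode at frequency f."""
    return 1.0 / math.expm1(h * f / (k * T_kelvin))

for T in (0.1, 0.05, 0.01):
    print(f"{T * 1000:>4.0f} mK: {thermal_photons(T):.1e} photons")
# ~1.0e-01 photons at 100 mK but ~3.8e-11 at 10 mK: the occupation falls
# off exponentially, which is why reaching 10 millikelvin matters so much.
```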

The university's primary role in the OpenSuperQ project is to lead the work on developing the application algorithms that will be executed on the OpenSuperQ quantum computer. It will also support the development of algorithms for quantum chemistry, optimisation and machine learning.

Also, Chalmers will head up efforts to improve quantum coherence in chips with multiple coupled qubits, including device design, process development, fabrication, packaging and testing. It will also conduct research to evaluate the performance of 2-qubit gates and develop advanced qubit control methods to mitigate systematic and incoherent errors to achieve targeted gate fidelities.

The rest is here:
Swedish university is behind quantum computing breakthrough - ComputerWeekly.com

A Tectonic Shift in Analytics and Computing Is Coming – Eos

More than 50 years ago, a fundamental scientific revolution occurred, sparked by the concurrent emergence of a huge amount of new data on seafloor bathymetry and profound intellectual insights from researchers rethinking conventional wisdom. Data and insight combined to produce the paradigm of plate tectonics. Similarly, in the coming decade, a new revolution in data analytics may rapidly overhaul how we derive knowledge from data in the geosciences. Two interrelated elements will be central in this process: artificial intelligence (AI, including machine learning methods as a subset) and high-performance computing (HPC).

Already today, geoscientists must understand modern tools of data analytics and the hardware on which they work. Now AI and HPC, along with cloud computing and interactive programming languages, are becoming essential tools for geoscientists. Here we discuss the current state of AI and HPC in Earth science and anticipate future trends that will shape applications of these developing technologies in the field. We also propose that it is time to rethink graduate and professional education to account for and capitalize on these quickly emerging tools.

Great strides in AI capabilities, including speech and facial recognition, have been made over the past decade, but the origins of these capabilities date back much further. In 1971, the U.S. Defense Advanced Research Projects Agency substantially funded a project called Speech Understanding Research, and it was generally believed at the time that artificial speech recognition was just around the corner. We know now that this was not the case, as today's speech and writing recognition capabilities emerged only as a result of both vastly increased computing power and conceptual breakthroughs such as the use of multilayered neural networks, which mimic the biological structure of the brain.

Recently, AI has gained the ability to create images of artificial faces that humans cannot distinguish from real ones by using generative adversarial networks (GANs). These networks combine two neural networks: one that produces a model and a second that tries to discriminate the generated model from the real one. Scientists have now started to use GANs to generate artificial geoscientific data sets.

These and other advances are striking, yet AI and many other artificial computing tools are still in their infancy. We cannot predict what AI will be able to do 20 to 30 years from now, but a survey of existing AI applications recently showed that computing power is the key when targeting practical applications today. The fact that AI is still in its early stages has important implications for HPC in the geosciences. Currently, geoscientific HPC studies have been dominated by large-scale time-dependent numerical simulations that use physical observations to generate models [Morra et al., 2021a]. In the future, however, we may work in the other direction: Earth, ocean, and atmospheric simulations may feed large AI systems that in turn produce artificial data sets that allow geoscientific investigations, such as Destination Earth, for which collected data are insufficient.

Development of AI capabilities is well underway in certain geoscience disciplines. For a decade now [Ma et al., 2019], remote sensing operations have been using convolutional neural networks (CNNs), a kind of neural network that adaptively learns which features to look at in a data set. In seismology (Figure 1), pattern recognition is the most common application of machine learning (ML), and recently, CNNs have been trained to find patterns in seismic data [Kong et al., 2019], leading to discoveries such as previously unrecognized seismic events [Bergen et al., 2019].
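
As a sketch of what such a network looks like in code, here is a minimal, hypothetical PyTorch classifier that labels three-component seismic windows as event versus noise. The architecture and sizes are illustrative, not taken from the studies cited above:

```python
import torch
import torch.nn as nn

# Hypothetical binary classifier for 3-component, 1,000-sample windows.
model = nn.Sequential(
    nn.Conv1d(3, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
    nn.Conv1d(16, 32, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
    nn.Flatten(),
    nn.Linear(32 * 62, 2),      # 1000 -> 250 -> 62 samples after two pools
)

waveforms = torch.randn(8, 3, 1000)  # stand-in for a batch of labeled windows
logits = model(waveforms)            # trained with cross-entropy in practice
print(logits.shape)                  # torch.Size([8, 2])
```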

New AI applications and technologies are also emerging; these involve, for example, the self-ordering of seismic waveforms to detect structural anomalies in the deep mantle [Kim et al., 2020]. Recently, deep generative models, which are based on neural networks, have shown impressive capabilities in modeling complex natural signals, with the most promising applications in autoencoders and GANs (e.g., for generating images from data).

CNNs are a form of supervised machine learning (SML), meaning that before they are applied for their intended use, they are first trained to find prespecified patterns in labeled data sets and to check their accuracy against an answer key. Training a neural network using SML requires large, well-labeled data sets as well as massive computing power. Massive computing power, in turn, requires massive amounts of electricity, such that the energy demand of modern AI models is doubling every 3.4 months and causing a large and growing carbon footprint.

In the future, the trend in geoscientific applications of AI might shift from using bigger CNNs to using more scalable algorithms that can improve performance with less training data and fewer computing resources. Alternative strategies will likely involve less energy-intensive neural networks, such as spiking neural networks, which reduce data inputs by analyzing discrete events rather than continuous data streams.

Unsupervised ML (UML), in which an algorithm identifies patterns on its own rather than searching for a user-specified pattern, is another alternative to data-hungry SML. One type of UML identifies unique features in a data set to allow users to discover anomalies of interest (e.g., evidence of hidden geothermal resources in seismic data) and to distinguish trends of interest (e.g., rapidly versus slowly declining production from oil and gas wells based on production rate transients) [Vesselinov et al., 2019].
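
In the same spirit, a minimal unsupervised example: scikit-learn's IsolationForest flags windows whose features differ from the bulk, with no labels involved (the features and data here are synthetic stand-ins):

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Stand-in feature vectors (e.g., spectral summaries of data windows);
# most are ordinary, and a handful of injected outliers play the anomalies.
ordinary = rng.normal(0.0, 1.0, size=(500, 8))
outliers = rng.normal(6.0, 1.0, size=(5, 8))
X = np.vstack([ordinary, outliers])

# The forest isolates points that differ from the bulk; -1 marks anomalies.
labels = IsolationForest(random_state=0).fit_predict(X)
print(int((labels == -1).sum()), "windows flagged as anomalous")
```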

AI is also starting to improve the efficiency of geophysical sensors. Data storage limitations require instruments such as seismic stations, acoustic sensors, infrared cameras, and remote sensors to record and save data sets that are much smaller than the total amount of data they measure. Some sensors use AI to detect when interesting data are recorded, and these data are selectively stored. Sensor-based AI algorithms also help minimize the energy consumption of, and prolong the life of, sensors located in remote regions, which are difficult to service and often powered by a single solar panel. These techniques include quantized CNNs (using 8-bit variables) running on minimal hardware, such as the Raspberry Pi [Wilkes et al., 2017].
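
The selective-storage idea predates deep learning: the classic short-term/long-term average (STA/LTA) trigger does the same job in a few lines, and it illustrates the principle that quantized on-sensor CNNs refine. A synthetic sketch:

```python
import numpy as np

def sta_lta(x, sta=50, lta=500):
    """Ratio of short-term to long-term average signal energy."""
    e = x ** 2
    sta_avg = np.convolve(e, np.ones(sta) / sta, mode="same")
    lta_avg = np.convolve(e, np.ones(lta) / lta, mode="same")
    return sta_avg / np.maximum(lta_avg, 1e-12)

rng = np.random.default_rng(0)
trace = 0.1 * rng.standard_normal(5000)                      # background noise
trace[3000:3100] += np.sin(np.linspace(0, 20 * np.pi, 100))  # synthetic event
keep = sta_lta(trace) > 4.0    # illustrative trigger threshold
print("samples kept:", int(keep.sum()), "of", trace.size)
```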

Powerful, efficient algorithms and software represent only one part of the data revolution; the hardware and networks that we use to process and store data have evolved significantly as well.

Since about 2004, when the increase in frequencies at which processors operate stalled at about 3 gigahertz (the end of Moore's law), computing power has been augmented by increasing the number of cores per CPU and by the parallel work of cores in multiple CPUs, as in computing clusters.

Accelerators such as graphics processing units (GPUs), once used mostly for video games, are now routinely used for AI applications and are at the heart of all major ML facilities (as well as the U.S. Exascale Strategy, a part of the National Strategic Computing Initiative). For example, Summit and Sierra, the two fastest supercomputers in the United States, are based on a hierarchical CPU-GPU architecture. Meanwhile, emerging tensor processing units, which were developed specifically for matrix-based operations, excel at the most demanding tasks of most neural network algorithms. In the future, computers will likely become increasingly heterogeneous, with a single system combining several types of processors, including specialized ML coprocessors (e.g., Cerebras) and quantum computing processors.

Computational systems that are physically distributed across remote locations and used on demand, usually called cloud computing, are also becoming more common, although these systems impose limitations on the code that can be run on them. For example, cloud infrastructures, in contrast to centralized HPC clusters and supercomputers, are not designed for performing large-scale parallel simulations. Cloud infrastructures face limitations on high-throughput interconnectivity, and the synchronization needed to help multiple computing nodes coordinate tasks is substantially more difficult to achieve for physically remote clusters. Although several cloud-based computing providers are now investing in high-throughput interconnectivity, the problem of synchronization will likely remain for the foreseeable future.

Artificial intelligence has proven invaluable in discovering and analyzing patterns in large, real-world data sets. It could also become a source of realistic artificial data sets, generated through models and simulations. Artificial data sets enable geophysicists to examine problems that are unwieldy or intractable using real-world data, because these data may be too costly or technically demanding to obtain, and to explore what-if scenarios or interconnected physical phenomena in isolation. For example, simulations could generate artificial data to help study seismic wave propagation; large-scale geodynamics; or flows of water, oil, and carbon dioxide through rock formations to assist in energy extraction and storage.

HPC and cloud computing will help produce and run 3D models, not only assisting in improved visualization of natural processes but also allowing for investigation of processes that can't be adequately studied with 2D modeling. In geodynamics, for example, using 2D modeling makes it difficult to calculate 3D phenomena like toroidal flow and vorticity because flow patterns are radically different in 3D. Meanwhile, phenomena like crustal porosity waves (waves of high porosity in rocks; Figure 2) and corridors of fast-moving ice in glaciers require extremely high spatial and temporal resolutions in 3D to capture [Räss et al., 2020].

Fig. 2. A 3D modeling run with 16 billion degrees of freedom simulates flow focusing in porous media and identifies a pulsed behavior phenomenon called porosity waves. Credit: Räss et al. [2018], CC BY 4.0

Adding an additional dimension to a model can require a significant increase in the amount of data processed. For example, in exploration seismology, going from a 2D to a 3D simulation involves a transition from requiring three-dimensional data (i.e., source, receiver, time) to five-dimensional data (source x, source y, receiver x, receiver y, and time [e.g., Witte et al., 2020]). AI can help with this transition. At the global scale, for example, the assimilation of 3D simulations in iterative full-waveform inversions for seismic imaging was performed recently with limited real-world data sets, employing AI techniques to maximize the amount of information extracted from seismic traces while maintaining the high quality of the data [Lei et al., 2020].
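
To see how quickly those added dimensions inflate data volumes, consider a back-of-envelope count (the survey sizes are made-up but typical in scale):

```python
# Back-of-envelope exploration-seismology survey sizes (invented counts).
sources, receivers, samples = 1_000, 1_000, 4_000
d3 = sources * receivers * samples          # 2D survey: (source, receiver, time)
d5 = sources**2 * receivers**2 * samples    # 3D survey: (sx, sy, rx, ry, time)
print(f"3D data: {d3:.0e} samples; 5D data: {d5:.0e} samples")
# Each added spatial dimension multiplies the volume by the grid size.
```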

As far as we've come in developing AI for uses in geoscientific research, there is plenty of room for growth in the algorithms and computing infrastructure already mentioned, as well as in other developing technologies. For example, interactive programming, in which the programmer develops new code while a program is active, and language-agnostic programming environments that can run code in a variety of languages are young techniques that will facilitate introducing computing to geoscientists.

Programming languages, such as Python and Julia, which are now being taught to Earth science students, will accompany the transition to these new methods and will be used in interactive environments such as the Jupyter Notebook. Julia was shown recently to perform well as compiled code for machine learning algorithms in its most recent implementations, such as the ones using differentiable programming, which reduces computational resource and energy requirements.

Quantum computing, which uses the quantum states of atoms rather than streams of electrons to transmit data, is another promising development that is still in its infancy but that may lead to the next major scientific revolution. It is forecast that by the end of this decade, quantum computers will be applied in solving many scientific problems, including those related to wave propagation, crustal stresses, atmospheric simulations, and other topics in the geosciences. With competition from China in developing quantum technologies and AI, quantum computing and quantum information applications may become darlings of major funding opportunities, offering the means for ambitious geophysicists to pursue fundamental research.

Taking advantage of these new capabilities will, of course, require geoscientists who know how to use them. Today, many geoscientists face enormous pressure to requalify themselves for a rapidly changing job market and to keep pace with the growing complexity of computational technologies. Academia, meanwhile, faces the demanding task of designing innovative training to help students and others adapt to market conditions, although finding professionals who can teach these courses is challenging because they are in high demand in the private sector. However, such teaching opportunities could provide a point of entry for young scientists specializing in computer science or part-time positions for professionals retired from industry or national labs [Morra et al., 2021b].

The coming decade will see a rapid revolution in data analytics that will significantly affect the processing and flow of information in the geosciences. Artificial intelligence and high-performance computing are the two central elements shaping this new landscape. Students and professionals in the geosciences will need new forms of education enabling them to rapidly learn the modern tools of data analytics and predictive modeling. If done well, the concurrence of these new tools and a workforce primed to capitalize on them could lead to new paradigm-shifting insights that, much as the plate tectonic revolution did, help us address major geoscientific questions in the future.

The listed authors thank Peter Gerstoft, Scripps Institution of Oceanography, University of California, San Diego; Henry M. Tufo, University of Colorado Boulder; and David A. Yuen, Columbia University and Ocean University of China, Qingdao, who contributed equally to the writing of this article.

Continued here:
A Tectonic Shift in Analytics and Computing Is Coming - Eos

Toshiba and 10 other Japanese firms to form quantum tech tie-up – The Japan Times

Eleven Japanese companies, including Toshiba Corp., plan to jointly launch a council this summer in a bid to create a new industry using quantum technology, the companies said Monday.

The council will identify and discuss issues linked to quantum computers, quantum cryptography and other base technologies, as well as issues related to human resources and rules, with the aim of exploring the possibility of industrializing quantum technology, which is widely expected to play a key role in national security.

Quantum computers have much higher computing capabilities than conventional computers, while quantum cryptography theoretically makes wiretapping impossible.

With the council, the 11 companies, which also include Toyota Motor Corp. and Nippon Telegraph and Telephone Corp., hope to boost their presence at a time when U.S. and Chinese players are vying with each other in the research and development of quantum technology.

"This is an all-Japan system aimed at making the country a world innovation leader in quantum technology," Toshiba President Satoshi Tsunakawa said at the council's foundation meeting on Monday.

Also on Monday, online flea market operator Mercari Inc. said it has set up an organization for the research and development of quantum internet, jointly with Keio University and the University of Tokyo.

Quantum internet, which involves the exchange of quantum data, allows users to have safe communication, protecting them from prying eyes.

Mercari plans to encourage other companies to join the organization. Shota Nagayama, a senior researcher at Mercari, said, "We want more firms to join our efforts to accumulate knowledge."

The new organization aims to put quantum internet into practical use on a trial basis in 15 years.

These moves highlight that industry-government-academia cooperation is gaining momentum in Japan over the research and development of quantum technology, which is applicable to many sorts of businesses.

View post:
Toshiba and 10 other Japanese firms to form quantum tech tie-up - The Japan Times