A Tectonic Shift in Analytics and Computing Is Coming – Eos
More than 50 years ago, a fundamental scientific revolution occurred, sparked by the concurrent emergence of a huge amount of new data on seafloor bathymetry and profound intellectual insights from researchers rethinking conventional wisdom. Data and insight combined to produce the paradigm of plate tectonics. Similarly, in the coming decade, a new revolution in data analytics may rapidly overhaul how we derive knowledge from data in the geosciences. Two interrelated elements will be central in this process: artificial intelligence (AI, including machine learning methods as a subset) and high-performance computing (HPC).
Already today, geoscientists must understand modern tools of data analytics and the hardware on which they work. Now AI and HPC, along with cloud computing and interactive programming languages, are becoming essential tools for geoscientists. Here we discuss the current state of AI and HPC in Earth science and anticipate future trends that will shape applications of these developing technologies in the field. We also propose that it is time to rethink graduate and professional education to account for and capitalize on these quickly emerging tools.
Great strides in AI capabilities, including speech and facial recognition, have been made over the past decade, but the origins of these capabilities date back much further. In 1971, the U.S. Defense Advanced Research Projects Agency substantially funded a project called Speech Understanding Research, and it was generally believed at the time that artificial speech recognition was just around the corner. We know now that this was not the case, as today's speech and writing recognition capabilities emerged only as a result of both vastly increased computing power and conceptual breakthroughs such as the use of multilayered neural networks, which mimic the biological structure of the brain.
Recently, AI has gained the ability to create images of artificial faces that humans cannot distinguish from real ones by using generative adversarial networks (GANs). These networks combine two neural networks: one that produces candidate models and a second that tries to discriminate the generated models from real ones. Scientists have now started to use GANs to generate artificial geoscientific data sets.
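The adversarial interplay between the two networks can be sketched at toy scale. The snippet below is a deliberately minimal, hypothetical illustration (not a geoscientific GAN): the "generator" is a single linear map of noise, the "discriminator" a logistic unit, and the two are trained against each other with hand-derived gradients so the generated samples drift toward the real distribution.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# Real "data": samples from N(4, 1). Generator g(z) = a*z + b imitates them;
# discriminator d(x) = sigmoid(w*x + c) tries to tell real from generated.
a, b = 1.0, 0.0          # generator parameters
w, c = 0.1, 0.0          # discriminator parameters
lr = 0.05

for step in range(5000):
    x_real = rng.normal(4.0, 1.0)
    z = rng.normal()
    x_fake = a * z + b

    # Discriminator ascent: push d(x_real) toward 1 and d(x_fake) toward 0.
    d_real, d_fake = sigmoid(w * x_real + c), sigmoid(w * x_fake + c)
    w += lr * ((1 - d_real) * x_real - d_fake * x_fake)
    c += lr * ((1 - d_real) - d_fake)

    # Generator ascent on log d(g(z)): move generated samples toward
    # whatever currently fools the discriminator.
    d_fake = sigmoid(w * x_fake + c)
    a += lr * (1 - d_fake) * w * z
    b += lr * (1 - d_fake) * w

# After training, generated samples cluster near the real mean.
fakes = a * rng.normal(size=1000) + b
```

Real GANs replace the two one-parameter maps with deep networks, but the alternating two-player update is the same.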
These and other advances are striking, yet AI and many other artificial computing tools are still in their infancy. We cannot predict what AI will be able to do 20 to 30 years from now, but a recent survey of existing AI applications showed that computing power is the key when targeting practical applications today. The fact that AI is still in its early stages has important implications for HPC in the geosciences. Currently, geoscientific HPC studies are dominated by large-scale time-dependent numerical simulations that use physical observations to generate models [Morra et al., 2021a]. In the future, however, we may work in the other direction: Earth, ocean, and atmospheric simulations may feed large AI systems that in turn produce artificial data sets enabling geoscientific investigations, such as Destination Earth, for which collected data are insufficient.
Development of AI capabilities is well underway in certain geoscience disciplines. For a decade now [Ma et al., 2019], remote sensing operations have been using convolutional neural networks (CNNs), a kind of neural network that adaptively learns which features to look at in a data set. In seismology (Figure 1), pattern recognition is the most common application of machine learning (ML), and recently, CNNs have been trained to find patterns in seismic data [Kong et al., 2019], leading to discoveries such as previously unrecognized seismic events [Bergen et al., 2019].
New AI applications and technologies are also emerging; these involve, for example, the self-ordering of seismic waveforms to detect structural anomalies in the deep mantle [Kim et al., 2020]. Recently, deep generative models, which are based on neural networks, have shown impressive capabilities in modeling complex natural signals, with the most promising applications in autoencoders and GANs (e.g., for generating images from data).
CNNs are a form of supervised machine learning (SML), meaning that before they are applied for their intended use, they are first trained to find prespecified patterns in labeled data sets and to check their accuracy against an answer key. Training a neural network using SML requires large, well-labeled data sets as well as massive computing power. Massive computing power, in turn, requires massive amounts of electricity; the energy demand of modern AI models is doubling every 3.4 months, causing a large and growing carbon footprint.
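The train-on-labels, score-against-an-answer-key loop can be shown with the simplest possible "network," a logistic classifier on synthetic two-feature data (the features and class structure here are purely illustrative, not real seismic attributes):

```python
import numpy as np

rng = np.random.default_rng(42)

# Labeled training set: two synthetic "event classes", each described by
# two features (say, amplitude and dominant frequency -- illustrative only).
X = np.vstack([rng.normal(0.0, 1.0, (200, 2)),    # class 0
               rng.normal(3.0, 1.0, (200, 2))])   # class 1
y = np.repeat([0, 1], 200)

# Minimal supervised model: one logistic layer, gradient descent on
# cross-entropy between predictions and the provided labels.
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted P(class 1)
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * (p - y).mean()

# The "answer key" step: accuracy on held-out labeled data.
X_test = np.vstack([rng.normal(0.0, 1.0, (100, 2)),
                    rng.normal(3.0, 1.0, (100, 2))])
y_test = np.repeat([0, 1], 100)
pred = (1.0 / (1.0 + np.exp(-(X_test @ w + b))) > 0.5).astype(int)
accuracy = (pred == y_test).mean()
```

A CNN differs in scale and architecture, not in this basic supervised workflow; the cost discussed above comes from repeating such updates over millions of parameters and examples.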
In the future, the trend in geoscientific applications of AI might shift from using ever bigger CNNs to using more scalable algorithms that can improve performance with less training data and fewer computing resources. Alternative strategies will likely involve less energy-intensive neural networks, such as spiking neural networks, which reduce data inputs by analyzing discrete events rather than continuous data streams.
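The data reduction behind the event-driven idea can be illustrated without a spiking network itself: the sketch below (a hypothetical delta/threshold encoder, not any published scheme) stores a sample only when the signal has changed meaningfully since the last stored value, so a long quiet stream collapses to a handful of events around the transient.

```python
import numpy as np

# A continuous "sensor stream": slow drift plus one brief transient burst.
t = np.linspace(0, 10, 5000)
signal = 0.01 * t + np.where((t > 4) & (t < 4.2), np.sin(50 * t), 0.0)

# Event (delta) encoding: emit an event only when the signal has moved
# more than `threshold` since the last value that was actually stored.
threshold = 0.05
events = []               # (sample index, value) pairs kept
last = signal[0]
for i, v in enumerate(signal):
    if abs(v - last) > threshold:
        events.append((i, v))
        last = v

compression = len(events) / len(signal)   # fraction of samples retained
```

Spiking networks then operate directly on such sparse event trains, which is where the energy savings come from.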
Unsupervised ML (UML), in which an algorithm identifies patterns on its own rather than searching for a user-specified pattern, is another alternative to data-hungry SML. One type of UML identifies unique features in a data set to allow users to discover anomalies of interest (e.g., evidence of hidden geothermal resources in seismic data) and to distinguish trends of interest (e.g., rapidly versus slowly declining production from oil and gas wells based on production rate transients) [Vesselinov et al., 2019].
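The well-production example can be made concrete with a minimal unsupervised sketch: synthetic decline curves (hypothetical numbers throughout), a fitted decline constant per well, and a simple two-cluster k-means grouping that never sees any labels.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(36)  # months of production history

# Synthetic histories: 10 rapidly and 10 slowly declining wells (illustrative).
fast = [100 * np.exp(-0.20 * t) * rng.normal(1, 0.02, t.size) for _ in range(10)]
slow = [100 * np.exp(-0.03 * t) * rng.normal(1, 0.02, t.size) for _ in range(10)]
wells = fast + slow

# Feature: fitted decline constant = minus the slope of log-production vs. time.
declines = np.array([-np.polyfit(t, np.log(q), 1)[0] for q in wells])

# Unsupervised grouping: 1-D k-means with k=2; no labels used anywhere.
centers = np.array([declines.min(), declines.max()])
for _ in range(20):
    labels = np.abs(declines[:, None] - centers[None, :]).argmin(axis=1)
    centers = np.array([declines[labels == k].mean() for k in (0, 1)])
```

The algorithm recovers the fast/slow split purely from the structure of the data, which is the essential contrast with the supervised training described earlier.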
AI is also starting to improve the efficiency of geophysical sensors. Data storage limitations require instruments such as seismic stations, acoustic sensors, infrared cameras, and remote sensors to record and save data sets that are much smaller than the total amount of data they measure. Some sensors use AI to detect when interesting data are recorded, and only these data are selectively stored. Sensor-based AI algorithms also help minimize energy consumption by, and prolong the life of, sensors located in remote regions, which are difficult to service and often powered by a single solar panel. These techniques include quantized CNNs (using 8-bit variables) running on minimal hardware, such as a Raspberry Pi [Wilkes et al., 2017].
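The 8-bit trick can be sketched in a few lines. This is one common scheme (symmetric linear quantization, assumed here for illustration; it is not necessarily the exact method of the cited work): float32 weights are mapped onto the integer range [-127, 127], cutting storage fourfold at the cost of a small, bounded rounding error.

```python
import numpy as np

rng = np.random.default_rng(3)
weights = rng.normal(0, 0.1, size=(3, 3, 16)).astype(np.float32)  # a conv kernel

# Symmetric linear quantization to int8: map [-max|w|, +max|w|] onto [-127, 127].
scale = np.abs(weights).max() / 127.0
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)

# Dequantize to measure the rounding error the compression introduced.
restored = q.astype(np.float32) * scale
max_err = np.abs(weights - restored).max()
# Storage shrinks 4x (float32 -> int8); the error is at most scale/2 per weight.
```

On a device like a Raspberry Pi, the integer arithmetic is also cheaper than floating point, which is what makes inference feasible on a single solar panel's power budget.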
Powerful, efficient algorithms and software represent only one part of the data revolution; the hardware and networks that we use to process and store data have evolved significantly as well.
Since about 2004, when the increase in frequencies at which processors operate stalled at about 3 gigahertz (the end of Moore's law), computing power has been augmented by increasing the number of cores per CPU and by the parallel work of cores in multiple CPUs, as in computing clusters.
Accelerators such as graphics processing units (GPUs), once used mostly for video games, are now routinely used for AI applications and are at the heart of all major ML facilities (as well as the U.S. Exascale Strategy, a part of the National Strategic Computing Initiative). For example, Summit and Sierra, the two fastest supercomputers in the United States, are based on a hierarchical CPU-GPU architecture. Meanwhile, emerging tensor processing units, which were developed specifically for matrix-based operations, excel at the most demanding tasks of most neural network algorithms. In the future, computers will likely become increasingly heterogeneous, with a single system combining several types of processors, including specialized ML coprocessors (e.g., Cerebras) and quantum computing processors.
Computational systems that are physically distributed across remote locations and used on demand, usually called cloud computing, are also becoming more common, although these systems impose limitations on the code that can be run on them. For example, cloud infrastructures, in contrast to centralized HPC clusters and supercomputers, are not designed for performing large-scale parallel simulations. Cloud infrastructures face limitations on high-throughput interconnectivity, and the synchronization needed to help multiple computing nodes coordinate tasks is substantially more difficult to achieve for physically remote clusters. Although several cloud-based computing providers are now investing in high-throughput interconnectivity, the problem of synchronization will likely remain for the foreseeable future.
Artificial intelligence has proven invaluable in discovering and analyzing patterns in large, real-world data sets. It could also become a source of realistic artificial data sets, generated through models and simulations. Artificial data sets enable geophysicists to examine problems that are unwieldy or intractable using real-world data (because these data may be too costly or technically demanding to obtain) and to explore what-if scenarios or interconnected physical phenomena in isolation. For example, simulations could generate artificial data to help study seismic wave propagation; large-scale geodynamics; or flows of water, oil, and carbon dioxide through rock formations to assist in energy extraction and storage.
HPC and cloud computing will help produce and run 3D models, not only assisting in improved visualization of natural processes but also allowing for investigation of processes that can't be adequately studied with 2D modeling. In geodynamics, for example, using 2D modeling makes it difficult to calculate 3D phenomena like toroidal flow and vorticity because flow patterns are radically different in 3D. Meanwhile, phenomena like crustal porosity waves (waves of high porosity in rocks; Figure 2) and corridors of fast-moving ice in glaciers require extremely high spatial and temporal resolutions in 3D to capture [Räss et al., 2020].
Fig. 2. A 3D modeling run with 16 billion degrees of freedom simulates flow focusing in porous media and identifies a pulsed behavior phenomenon called porosity waves. Credit: Räss et al. [2018], CC BY 4.0
Adding a dimension to a model can require a significant increase in the amount of data processed. For example, in exploration seismology, going from a 2D to a 3D simulation involves a transition from three-dimensional data (source, receiver, and time) to five-dimensional data (source x, source y, receiver x, receiver y, and time) [e.g., Witte et al., 2020]. AI can help with this transition. At the global scale, for example, the assimilation of 3D simulations in iterative full-waveform inversions for seismic imaging was performed recently with limited real-world data sets, employing AI techniques to maximize the amount of information extracted from seismic traces while maintaining high data quality [Lei et al., 2020].
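The jump from three to five index dimensions is easy to underestimate; a back-of-envelope sketch with hypothetical survey sizes (the counts below are illustrative, not from any real acquisition) shows how quickly the data volume outgrows what a single machine can hold:

```python
import numpy as np

# Hypothetical acquisition geometries, float32 samples (4 bytes each).
n_t = 2000
d2_shape = (100, 400, n_t)            # 2D survey: (source, receiver, time)
d3_shape = (50, 50, 100, 100, n_t)    # 3D survey: (src_x, src_y, rec_x, rec_y, time)

bytes_2d = 4 * np.prod(d2_shape)      # ~0.3 GB: fits in memory comfortably
bytes_3d = 4 * np.prod(d3_shape)      # ~200 GB: too large to hold at once
growth = bytes_3d / bytes_2d          # hundreds-fold increase
```

Volumes of this size are exactly why 3D full-waveform inversion leans on HPC, and why techniques that extract more information per trace are so valuable.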
As far as we've come in developing AI for uses in geoscientific research, there is plenty of room for growth in the algorithms and computing infrastructure already mentioned, as well as in other developing technologies. For example, interactive programming, in which the programmer develops new code while a program is active, and language-agnostic programming environments that can run code in a variety of languages are young techniques that will facilitate introducing computing to geoscientists.
Programming languages such as Python and Julia, which are now being taught to Earth science students, will accompany the transition to these new methods and will be used in interactive environments such as the Jupyter Notebook. Julia was recently shown to perform as well as compiled code for machine learning algorithms in its most recent implementations, such as those using differentiable programming, which reduces computational resource and energy requirements.
Quantum computing, which uses the quantum states of atoms rather than streams of electrons to transmit data, is another promising development that is still in its infancy but that may lead to the next major scientific revolution. It is forecast that by the end of this decade, quantum computers will be applied in solving many scientific problems, including those related to wave propagation, crustal stresses, atmospheric simulations, and other topics in the geosciences. With competition from China in developing quantum technologies and AI, quantum computing and quantum information applications may become darlings of major funding opportunities, offering the means for ambitious geophysicists to pursue fundamental research.
Taking advantage of these new capabilities will, of course, require geoscientists who know how to use them. Today, many geoscientists face enormous pressure to requalify themselves for a rapidly changing job market and to keep pace with the growing complexity of computational technologies. Academia, meanwhile, faces the demanding task of designing innovative training to help students and others adapt to market conditions, although finding professionals who can teach these courses is challenging because they are in high demand in the private sector. However, such teaching opportunities could provide a point of entry for young scientists specializing in computer science or part-time positions for professionals retired from industry or national labs [Morra et al., 2021b].
The coming decade will see a rapid revolution in data analytics that will significantly affect the processing and flow of information in the geosciences. Artificial intelligence and high-performance computing are the two central elements shaping this new landscape. Students and professionals in the geosciences will need new forms of education enabling them to rapidly learn the modern tools of data analytics and predictive modeling. If done well, the concurrence of these new tools and a workforce primed to capitalize on them could lead to new paradigm-shifting insights that, much as the plate tectonic revolution did, help us address major geoscientific questions in the future.
The listed authors thank Peter Gerstoft, Scripps Institution of Oceanography, University of California, San Diego; Henry M. Tufo, University of Colorado Boulder; and David A. Yuen, Columbia University and Ocean University of China, Qingdao, who contributed equally to the writing of this article.