Archive for the ‘Quantum Computing’ Category

$5M NSF Grant to Fund Research on Quantum Internet Foundations – Maryland Today

The National Science Foundation (NSF) today announced a $5 million, two-year award to a University of Maryland-led multi-institutional team to develop quantum interconnects, crucial technology to connect quantum computers and pave the way for a quantum internet.

The team, QuaNeCQT (Quantum Networks to Connect Quantum Technology), has been developing the quantum versions of a modem and a router, familiar equipment in the world of standard, or classical, computing, but a challenge to build for devices that operate on the principles of quantum mechanics.

The devices allow ion trap quantum computers, a leading approach to quantum information processing developed in part at the University of Maryland, to exchange quantum information over distances measured in kilometers, eventually leading to the development of networks that could revolutionize numerous industries and help solve vexing societal problems.

"Quantum networks are at an inflection point with the potential for significant expansion," said Edo Waks, a professor of electrical and computer engineering and associate director of UMD's Quantum Technology Center (QTC). "But the scale-up can't happen without standardized modular hardware between the new computers that are emerging and the vast infrastructure of the current internet."

"The hardware we are developing will address the critical gap, opening up the door to the future quantum internet that can connect quantum computers over continental distances," said Waks.

Other UMD team members include physics Assistant Professor and QTC Fellow Norbert Linke, and Mid-Atlantic Crossroads (MAX) Executive Director Tripti Sinha, assistant vice president and chief technology officer for UMD's Division of Information Technology. The team also includes Dirk Englund of the Massachusetts Institute of Technology and Saikat Guha of the University of Arizona.

The researchers plan to deploy this new technology in the Mid-Atlantic Region Quantum Internet (MARQI), UMD's regional quantum network footprint. The MARQI network will interconnect quantum computers at UMD, the Army Research Laboratory, MAX and IonQ, a leading quantum computing company focused on ion-trap computers that operates in UMD's Discovery District, with a potential for significant expansion.

During the first phase of research, the team developed working prototypes of the quantum router and modem. Using a process called quantum frequency conversion, the modem converts signals from a quantum computer to infrared photons that can propagate through optical fibers over long distances. The router is powered by a silicon photonic chip that manipulates quantum signals in the network using quantum teleportation, an effect demonstrated in 2009 by researchers at UMD's Joint Quantum Institute that allows quantum states to be transferred between particles that are physically separate. The team has deployed these prototypes in the MARQI network and established direct links with the various nodes of the network.

A quantum network could revolutionize numerous industries that take advantage of quantum computing, including computing, banking, medicine and data analytics. It would also enable the connection of many small quantum computers into powerful distributed quantum computers that could potentially solve problems with significant societal impact, from curing diseases to new approaches to fighting climate change.

As quantum technology converges with the Internet, a new technology sector would emerge, the researchers say, bringing with it the potential for major economic growth by producing rapid technological innovation and creating a large number of new jobs for the future quantum workforce, just as the emergence of the Internet did toward the late 20th century.

Visit link:
$5M NSF Grant to Fund Research on Quantum Internet Foundations - Maryland Today

IBM Moves to Step Out From the Pack With Quantum and Power 10 – Datamation

IBM recently held two briefings on their Power10 platform and their Partner Ecosystem, which is one of the richest in the market.

For Power10, they had Pfizer talk about how Power has been critical to their operations and success. At their Partner event, IBM also spoke about an effort with Mercedes-Benz to use quantum computing to understand battery technology better and create something revolutionary when it comes to stored electrical energy.

Let's talk about both of these efforts by IBM.

IBM has two major cloud initiatives outside of their own IBM Cloud offering: the hybrid cloud and the multicloud, and they dovetail with each other nicely. But the dominant server architecture in the cloud is Intel's x86, and displacing a technology as dominant as x86 isn't a viable strategy. However, designing a part that does a few things better than x86 is doable, because Intel's platform has to be a jack-of-all-trades, making it very difficult for it to be a master of any of them.

Scott Growth, Pfizer's ERP architect, indicated that IBM's Power platform, and the breakthroughs they have collectively had on it, changed many patients' lives for the better. He testified that this platform allowed him to deploy 19K virtual threads over the 1,300 cores they have deployed. This ability to massively share CPU resources has been critical to their enterprise SAP deployment through a single instance. They don't think any other platform can provide the same massive workload on a similarly small resource.

One of the significant areas of focus is the idea of a frictionless hybrid cloud, where data and applications can move seamlessly between on-premises and cloud environments, and likely across multiple cloud providers depending on the need. Their new generation, the IBM Power E1080 servers, promises 30% more performance and 52% less energy usage than the prior generation. A new memory architecture also promises a 2.5x improvement in memory RAS (reliability, availability, and serviceability). With embedded AI capabilities coupled with advanced recovery and automatic self-healing, the Power10 platform looks as impressive as Pfizer indicated and well-differentiated in the market.

IBM was one of the first companies to research quantum computing, which is expected to be a significant game changer for these kinds of analytical workloads. For several years, I was the lead battery analyst for the U.S., and during that time, I visited IBM's labs where they were working on a lithium-ion replacement called lithium-air. That research continues, promising lower costs, faster charging, higher power densities, higher energy efficiency, and lower flammability.

But beyond this, IBM shared that they were working aggressively with Mercedes-Benz, one of the automotive companies attempting to pivot from the internal combustion engine (ICE) to electric vehicles (EVs), on a new battery architecture. IBM's battery work goes back decades, and the related research primarily used conventional computers. While they are a leader in quantum computing, providing early practical applications of this new computing power has proven daunting.

Using quantum computing to better understand how existing batteries work, and then using the related information to create a new battery class, is inspired. If successful, this battery advancement effort should not only create a far more capable battery but also provide a leading and prominent example of the benefits of using quantum computing in applied product research. Success would tend to move perceptions of quantum computing from near fantasy to practical reality, opening up demand for quantum computing tied to practical business applications.

In short, this effort might not just improve batteries. It could validate a general-purpose use for quantum computing far earlier than anticipated, creating a stronger foundation for the eventual birth of a quantum computing market.

IBM recently had two powerful announcements: their Power10 platform and E1080 server, providing a critical solution for those looking to massively leverage central computing resources and create frictionless hybrid environments; and their work with Mercedes-Benz, through the IBM Partner Ecosystem, to create a next-generation battery to power tomorrow's cars.

IBM continues to do significant research and create unique product innovations that could eventually change the world.

See original here:
IBM Moves to Step Out From the Pack With Quantum and Power 10 - Datamation

Explore Trends and COVID-19 Impact on Quantum Computing Market 2021 Research Report and Industry Forecast till 2027 | Know More Stillwater Current -…

The latest industry report, entitled the Global Quantum Computing Market 2021, focuses on analyses of the market's important factors with an in-depth approach, enables the user to assess long-term demand, and predicts specific executions. This report provides qualitative analysis, explaining product scope and elaborating industry insights and outlook to 2026. The global Quantum Computing market report is a significant reference for crucial and well-known players in the current market. The information detailed in the report offers an exhaustive appraisal of the major dynamics of the Quantum Computing market, such as opportunities, market trends, limits, and business strategies. In addition, the report also shows the present fundamental industry events together with their relevant effect on the market. The market study report also covers the top key players in the global Quantum Computing market, such as Google, IBM, D-Wave, Intel, Microsoft, 1QBIT, Anyon Systems, Cambridge Quantum Computing, ID Quantique, IonQ, QbitLogic, QC Ware, Quantum Circuits, Qubitekk, QxBranch, and Rigetti Computing.

Get a Free Sample PDF (including COVID-19 Impact Analysis, full TOC, Tables and Figures) of the Market Report @ https://www.syndicatemarketresearch.com/sample/quantum-computing-market

The research report mainly covers the present market growth rates (%) of the Global Quantum Computing market and its size on the basis of the last 5 years of historical data, besides the company profiles of top players/producers. The top-to-bottom statistics by market segment help to monitor future benefits and to settle on crucial decisions for advancement. The report focuses on developments and trends, markets and materials, SWOT analysis, scope, technologies, the CAPEX cycle, and the changing format of the Global Quantum Computing Market.

Global Quantum Computing Market Analysis by Manufacturers/Players, by Product Type, Application, and Regions

Moreover, this report portrays the primary product types, segments, and sub-segments of the industry. This brief outline includes the business overview, revenue share, latest events, product offerings, methods, and administration offerings of the dominant players. An accurate appraisal of the leading organizations, together with their strategic aptitudes, including innovation, cost, and consumer satisfaction, has been included in this study report relating to the market. The raw numbers incorporated into the worldwide Quantum Computing market report are included with the recognition and contribution of a global group of talented experts to give an up-to-date picture of the current advancements in the market.

The report provides comprehensive guidelines on the essential methodologies that have energized market development, alongside the strategies that would be victorious in the expected time. The report also covers the geography of the Quantum Computing market: North America, South America, Europe, the Middle East and Africa, and the Asia Pacific.

Do Inquiry For Customization & More Info Here: https://www.syndicatemarketresearch.com/inquiry/quantum-computing-market

This intensive regional assessment provides readers with a clear view of the most persuasive trends prevailing in each geographic area. Aside from this, the report additionally covers the industry size and shares of these regions, together with expected measurements, which are useful for organizations in understanding the consumption growth of these regions. In addition, the worldwide Quantum Computing market is surveyed in terms of revenue (USD Million) and volume.

The Global Quantum Computing Market is divided by Product Type (Hardware, Software, Services). Further, this analysis report is segmented by Application/end users (Simulation, Optimization, Sampling) based on historical and estimated industry share and compounded annual growth rate (CAGR in %), with size and Revenue (Million USD).
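The report quotes growth as a compounded annual growth rate (CAGR). As a quick illustration of how such a figure is derived, the standard formula is (end/start)^(1/years) - 1; the dollar figures below are hypothetical placeholders, not numbers taken from the report:

```python
def cagr(start_value: float, end_value: float, years: float) -> float:
    """Compound annual growth rate: the constant yearly rate that grows
    start_value into end_value over the given number of years."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

# Hypothetical example: a market growing from $500M to $1,700M over 5 years.
rate = cagr(500.0, 1700.0, 5)
print(f"CAGR: {rate:.1%}")  # ~27.7% per year
```

Note that CAGR smooths out year-to-year volatility into a single equivalent rate, which is why market reports favor it for multi-year forecasts.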

The analysis has utilized analytical instruments such as a competitive overview and Porter's Five Forces Analysis in translating the extent of strategies related to usage in the global Quantum Computing market in the anticipated period.

This Study Report Offers Following Objectives:

1. Forecast and analysis of the global Quantum Computing market sales, share, value, and status (2016-2018) and forecast (2021-2026).
2. Analyze the regional as well as country-level segments and share evolution for the global Quantum Computing Market.
3. Analysis of global industry-leading manufacturers/players.
4. Define and analyze the market competition landscape, with SWOT analysis.
5. Forecasts and analysis of the segments, sub-segments, and the regional markets based on the last 5 years of market history.
6. Analysis of the Quantum Computing market by Type, by Application/end users, and region-wise.
7. Forecast and analysis of the Global Quantum Computing Market trends, drivers, investment opportunities, openings, risks, difficulties, and recommendations.
8. Analyze the significant driving factors and the trends that restrict market growth.
9. Describe the stakeholders' opportunities in the market by identifying the high-growth segments.

There are 15 Key Chapters Covered in the Global Quantum Computing Market:

Chapter 1, Industry Overview of the Global Quantum Computing Market;
Chapter 2, Classification, Specifications, and Definition of market segments by Regions;
Chapter 3, Industry Suppliers, Manufacturing Process and Cost Structure, Chain Structure, Raw Materials;
Chapter 4, Specialized Information and Manufacturing Plants Analysis, Limit and Business Production Rate, Manufacturing Plants Distribution, R&D Status, and Technology Sources Analysis;
Chapter 5, Complete Market Research, Capacity, Sales and Sales Price Analysis by Company Segment;
Chapter 6, Analysis of the Regional Market, covering the United States, Europe, India, China, Japan, Korea & Taiwan;
Chapters 7 & 8, Quantum Computing Market Analysis by Major Manufacturers, and Segment Market Analysis (by Type) and (by Application);
Chapter 9, Regional Market Trend Analysis, Market Trend by Product Type and by Application;
Chapters 10 & 11, Supply Chain Analysis, Regional Marketing Type Analysis, Global Trade Type Analysis;
Chapter 12, Analysis of the global Quantum Computing industry's consumers;
Chapter 13, Research Findings/Conclusion, and analysis of sales channels, traders, distributors, and dealers;
Chapters 14 and 15, Appendix and data sources of the Quantum Computing market.

Note: In order to provide a more accurate market forecast, all our reports will be updated before delivery by considering the impact of COVID-19. (If you have any special requirements, please let us know and we will offer you the report as you want.)

About Syndicate Market Research:

At Syndicate Market Research, we provide reports about a range of industries such as healthcare & pharma, automotive, IT, insurance, security, packaging, electronics & semiconductors, medical devices, food & beverage, software & services, manufacturing & construction, defense & aerospace, agriculture, consumer goods & retailing, and so on. Every aspect of the market is covered in the report along with its regional data. Syndicate Market Research is committed to the requirements of our clients, offering tailored solutions best suited for strategy development and execution to get substantial results. Above this, we will be available for our clients 24/7.

Contact Us:

Syndicate Market Research
244 Fifth Avenue, Suite N202
New York, 10001, United States
Phone: +1 347 535 0815
Email ID: sales@syndicatemarketresearch.com
Website: www.syndicatemarketresearch.com
Blog: Syndicate Market Research Blog

Read more:
Explore Trends and COVID-19 Impact on Quantum Computing Market 2021 Research Report and Industry Forecast till 2027 | Know More Stillwater Current -...

DoD Buys Two New Supercomputers That Rank Among Its Most Powerful Ever – Breaking Defense

Sandia National Laboratory Computer Annex conducts the hourly walk-through of the Thunderbird supercomputer at 2 a.m.

WASHINGTON: The Pentagon recently completed a $68 million acquisition of two new supercomputing platforms and related technical services that rank among its most powerful supercomputers ever and will be among the top 100 performers globally.

"These are significant assets," Kevin Newmeyer, deputy director of the Defense Department's High Performance Computing Modernization Program (HPCMP), told Breaking Defense. "They bring to us an increase in our computing capacity and the latest advanced chips for artificial intelligence work, and storage to support applications of both computational and machine learning concepts within the same computer, that we hope will deliver products and services to the warfighter faster."

It's the HPCMP's job to give DoD military and civilian personnel, as well as defense contractor scientists, engineers, and technologists, access to such supercomputers to solve some of the military's most computationally complex problems.

The problems range from climate/weather/ocean modeling and simulation, space/astrophysical sciences, and acoustics to signal/image processing, data/decision analytics, and electronics, networks, and C4I systems. Newmeyer said the most common use case is computational fluid dynamics, which is required for making complicated calculations in areas such as aircraft and ship design and engineering.

For the latest acquisition, the Pentagon chose Penguin Computings TrueHPC supercomputing platform. The two new supercomputers, according to the company, will provide DoD with a combined total of over 365,000 cores, more than 775 terabytes of memory, and a total of 47 petabytes of high-performance storage, including over 5 petabytes of high-performance flash storage.

"That's about 150,000 computers all stacked together, operating as one thing," Newmeyer said. "If you laid them end to end, you would work your way pretty much across the country."

What does all that compute power get you? An additional 17.6 petaFLOPS, in total. FLOPS, or floating-point operations per second, are the standard measure of a supercomputer's performance. FLOPS are determined by how many real numbers a computer can process per second while accounting for the trade-off between range and precision of calculations.

FLOPS are a measure of computational power for solving computer-based problems. "It's the horsepower of a machine," Penguin's Vice President of Federal Sales Tom Ireland told Breaking Defense.

A petaFLOPS is one quadrillion (1,000,000,000,000,000) FLOPS. To put that in perspective, HPCMP currently has a total capacity across all of its supercomputers of approximately 100 petaFLOPS, according to Newmeyer. That includes the Navy's most powerful (known) supercomputer, Narwhal, which is capable of 12.8 petaFLOPS. The known part of the Air Force's most powerful supercomputer, Mustang, is capable of 4.87 petaFLOPS. (Part of Mustang is classified, Newmeyer noted.) Penguin's two TrueHPC supercomputers, expected to register at 8.5 petaFLOPS and 9 petaFLOPS, will be two of HPCMP's most powerful computers ever, Ireland said.
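The petaFLOPS arithmetic above is simple enough to sketch in a few lines; this is just a back-of-the-envelope check using the figures quoted in the article, not an official benchmark calculation:

```python
PETA = 10 ** 15  # 1 petaFLOPS = 10^15 floating-point operations per second

# Figures quoted in the article, in petaFLOPS:
narwhal = 12.8             # Navy's Narwhal
mustang_known = 4.87       # known (unclassified) part of the Air Force's Mustang
new_machines = [8.5, 9.0]  # the two new Penguin TrueHPC systems
fugaku = 442.01            # Japan's Fugaku (Top500, November 2020)

added = sum(new_machines)  # in line with the ~17.6 petaFLOPS total cited
print(f"New DoD capacity: {added} petaFLOPS = {added * PETA:.2e} FLOPS")
print(f"Fugaku is about {fugaku / max(new_machines):.0f}x either new machine")
```

The per-machine figures sum to 17.5 petaFLOPS, consistent with the rounded 17.6-petaFLOPS total quoted earlier, and the comparison shows how far even these top-100 machines sit below the current Top500 leader.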

According to the Top500 Project, the fastest supercomputer in the world as of June 2021 is Japan's Fugaku, which registered 442.01 petaFLOPS in November 2020, taking the top spot from IBM's Summit (148.6 petaFLOPS), which is housed at the Department of Energy's Oak Ridge National Laboratory.

The Pentagon's upgrade in supercomputing power comes amid an intense technological race against near-peer rival China. According to the Top500, China currently leads the world in the total number of supercomputers with 188, but when ranked by performance, the US has five of the top 10 most powerful supercomputers in the world, while China has two of the top 10. No other country has more than one in the top 10.

Ireland noted that Penguin, which has been building supercomputers for 20 years, has for years been running programs at the Department of Energy, which has the most powerful (known) supercomputers in the US. Fifteen of Penguin's debuts over 20 years have made the Top500, and were DoD to run official benchmarks on these two new supercomputers, they would rank within the top 100 worldwide, Ireland said.

The Navy's DoD Supercomputing Resource Center (DSRC) at Stennis Space Center in Mississippi will house one of the new platforms, while the other will go to the Air Force Research Lab's DSRC at Wright-Patterson Air Force Base in Dayton, Ohio.

But first Penguin has to build, deploy, and integrate them into HPCMP's network, known as the Defense Research Engineering Network (DREN). Ireland said Penguin's TrueHPC consists of about 1,500 nodes, which must be engineered to work as one giant machine.

"The trick with distributed computing (meaning it's taking what heretofore was done on a mainframe-style computer, where it's all on a board, and it's broken up into separate, discrete servers) is making sure that is an adequate platform for any given application," Penguin's Chief Strategy Officer Matt Jacobs told Breaking Defense. "To make sure that balance between the elements is right and there's an appropriate amount of compute to solve the problem."

Jacobs said some of the key elements include data patterns, network traffic, and storage capacity, which all must be brought together in a way that doesn't strand investment in any given element of those resources and that is an effective production platform for the workload application. "That's really the art," he added.

Jacobs said that Penguin generally builds these types of platforms in a couple of months, but like many companies worldwide, Penguin has encountered challenges in the global supply chain, especially around chips. Jacobs and Ireland said the supply chain hiccups are beyond the company's control, but said they still wouldn't significantly delay the project.

Notably, the platforms will include over 100 NVIDIA graphics processing units, or GPUs, to bolster DoD's AI and machine learning capabilities, Ireland said.

Ultimately, Ireland said, the project is about keeping the US warfighter equipped with state-of-the-art technologies to solve compute problems. "We're keeping our warfighters current. You don't want them fighting wars with F-14s when there are F-22s."

It's unclear how long the era of supercomputers will last, as the US and China, among others, race ahead toward quantum computing, which uses quantum mechanics to make a technological leap in processing power. But Newmeyer said he's not concerned traditional supercomputing platforms will become obsolete anytime soon.

"You'll still have a use for these types of machines," he said. "Any quantum computer built in the near future is going to be highly expensive to operate, and [quantum computers] are only more useful for certain applications (maybe in some stuff around hypersonics, certainly cryptology, navigation); there quantum has a key role. But for general computation, [quantum] is an awful lot of money."

Read the rest here:
DoD Buys Two New Supercomputers That Rank Among Its Most Powerful Ever - Breaking Defense

Why We Live In A Golden Age Of Innovation – Forbes

Self-organizing team

Despite horrific daily headlines, the important news is that we are living in a golden age of innovation. It has been called the digital age, among other names, although that label is wrongly taken to imply that this is just about technology. In reality, it's the combination of both (new technology and a radically different kind of management) that makes things fundamentally different from what was happening in the 20th century.

The new digital technologies are amazing: the Internet, the cloud, algorithmic decision making, blockchain, artificial intelligence, quantum computing, and 3-D printing. Moreover, the technologies are evolving rapidly and, as Vivek Wadhwa et al. point out in their book, From Incremental To Exponential (Berrett-Koehler, 2020), they are interacting with each other to create even further possibilities.

Yet alone, technology doesn't make much difference. We have learned over the last two decades that few benefits ensue from the new technologies unless there is also a different mindset toward managing: human beings creating value for other human beings. It is a new meta-model of management that is fundamentally different from industrial-era management, which was based on efficiency and outputs. Firms such as Haier and Microsoft, among scores of other firms, have demonstrated the new way, not just in tech, but in cars, finance, health, agriculture, music, movies, retail, restaurants, gaming and hoteling. The new management mindset includes:

Embracing a goal of creating value for the customer as the primary foundation of everything the firm does.

Unleashing talent in self-organizing teams and micro-enterprises. Talent is now driving strategy, rather than vice versa.

Operating as a network of competence, rather than a hierarchy of authority.

Enabling a firm to create new businesses, new business models, platforms and ecosystems, and managing data as an asset.

And once both technology and management are in place, then dramatic benefits start appearing, along three dimensions.

First, the impressive benefits for customers are almost magic, transforming how we work, how we communicate, how we get about, how we shop, how we play games, how we deliver health care and education, how we raise our children, how we entertain ourselves, how we read, how we listen to music, how we watch theater, how we go to the movies, and how we worship: in short, how we live. This combination of new technology and new ways of running companies is changing most of our lives, even those in developing countries.

Second, it has changed the workplace, potentially for the better. When those doing the work are collaborating in self-organizing teams, focused on delivering value for customers, work can be meaningful and uplifting. At its best, human beings are delivering value for other human beings, as opposed to individuals producing things in accordance with instructions from bosses.

Third, it is much more profitable for the firms themselves, once they get fully into this mode. Microsoft is a striking example. It made a commitment to the new way in 2014, under the leadership of CEO (and now chairman) Satya Nadella. Since his taking over and implementing this different way of running a company, Microsoft has added $1.5 trillion to its market capitalization.

Some critics ask whether there is anything really new in the new way of managing. And indeed, management innovators have been working on some of these changes for at least a century, beginning with Mary Parker Follett in the 1920s. Yet until recently, there was little enduring success, as Art Kleiner noted in his book, The Age of Heretics (Jossey-Bass, 2008).

As Gary Hamel explained last year: "You can go back and read about the precursors of Agile management, about early attempts at building self-managing teams, about more participatory decision structures. A lot of this work, in the '60s and '70s, produced extraordinary results (huge gains in productivity and engagement) but few of the changes scaled up. Most of these efforts were ultimately aborted or marginalized. In the end, the empire struck back."

Thus, this new meta-model of management isn't just another variant of 20th century management. True, it is not yet everywhere. But the extraordinary gains being made by firms that have embraced the new meta-model create massive incentives for other firms to make the shift, as well as huge disincentives not to make the shift.

What makes Microsoft such a remarkable story is that it is often cited as an example of a stagnant bureaucracy that would never change. Its transformation shows that change is possible.

Meanwhile, many big old industrial-era firms are struggling. That's why it's dangerous to think of the new age as the fourth industrial age, pace Klaus Schwab and his book The Fourth Industrial Revolution (Currency, 2017). If firms think of this as a continuation, or evolution, of the industrial age, they are unlikely to succeed with the new meta-model of management needed for handling digital technologies.

Nobel-Prize-winning economist Edmund Phelps has suggested in his book, Mass Flourishing: How Grassroots Innovation Created Jobs, Challenge, and Change (Princeton, 2013), that we could be on the brink of a Mass Human Flourishing. And when we see what is happening in the best firms today, it begins to look plausible. But there is also a legitimate concern that we could be on the brink of a Mass Human Repression, if these new technologies were to be used for malign purposes. This is a choice that societies around the world are now facing. Are we heading, as Phelps suggests, towards a Mass Human Flourishing? Or a Mass Human Repression? The choice is ours. But to make that choice, we first have to understand it.


See the original post here:
Why We Live In A Golden Age Of Innovation - Forbes