Archive for the ‘Quantum Computer’ Category

Tips on where and when to use a quantum computer – TechHQ

Where and when should you use a quantum computer? It's one of the most common questions that experts such as Kirk Bresniker, Chief Architect at Hewlett Packard Labs, get asked by business leaders. Enterprises want to know where in the IT portfolio quantum computers will bring the most significant rewards, and when is the right time for firms to invest in solutions.

For decades, quantum computing developers have been promising big things from quantum computers, which is understandable. Quantum computers are costly to develop, and being modest about the technology isn't going to win over investors. However, it's important to note that quantum computers aren't universal computing devices.

"Quantum computing promises transformational gains for solving some problems, but little or none for others," write MIT Sloan School of Management researchers in a paper titled "The Quantum Tortoise and the Classical Hare," submitted to arXiv.

The team, led by Neil Thompson, whose career includes appointments at Lawrence Livermore National Laboratory, Bain and Company, the United Nations, the World Bank, and the Canadian Parliament, has come up with a simple framework for understanding which problems quantum computing will accelerate (and which it will not).

Quantum computers open the door to probabilistic computing, with quantum gates adding a twist to each of the qubits in the calculation. As the system evolves, the qubits interact and point to the most likely solution to the problem that they've been arranged to describe.
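To make the idea of superposition and gates a little more concrete, here is a minimal sketch (not from the article, and deliberately simplified): a single qubit represented as a pair of complex amplitudes, with a Hadamard gate rotating it into an equal superposition. The function names are illustrative, not part of any real quantum SDK.

```python
import math

# A qubit state as a pair of complex amplitudes (alpha, beta) for |0> and |1>.
# The measurement probabilities are |alpha|^2 and |beta|^2 and must sum to 1.

def hadamard(state):
    """Apply the Hadamard gate: rotates |0> or |1> into an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement probabilities for outcomes 0 and 1."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

zero = (1 + 0j, 0 + 0j)      # the |0> basis state
superposed = hadamard(zero)  # equal superposition of |0> and |1>

p0, p1 = probabilities(superposed)
print(round(p0, 3), round(p1, 3))  # 0.5 0.5: each outcome equally likely
```

Real machines entangle many such qubits, and the "twist" the gates add is a complex phase on these amplitudes, which is what lets interference amplify the likely answers.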

Prodding a bit further: if we consider classical machines as mapping business questions onto maths (a perspective shared by Scott Buchholz, Global Quantum Lead at Deloitte Consulting, at this year's D-Wave Qubits conference), then quantum computers give us the chance to use physics instead.

It turns out that some questions are easier to map onto physics than others, and this gets to one of the key considerations in the MIT framework on where and when to use a quantum computer.

Much of the talk on progress in quantum computing surrounds the number of qubits. Systems are notoriously noisy, which drives up the number of physical qubits required to implement error-corrected logical qubits. On top of this, there are multiple ways of engineering the superposition of ones and zeros through the use of superconducting, trapped-ion, photonic, or silicon spin qubits.

Each quantum computing developer has its own preferred approach, and as you walk down the path of trying to understand how quantum computing works, the discussion becomes one about the technology. And this is fine. Large companies can engage their R&D teams and have conversations with hardware developers.

However, just as you don't need to understand what's happening inside a CPU to benefit from a laptop, companies can focus their attention on the kinds of problems that quantum computers can help with, rather than getting bogged down with the numbers and types of qubits.

In their decision-making framework, Thompson and his colleagues identify two determinants of when to use a quantum computer: the efficiency of the algorithm and the scale of the problem that needs to be solved.

"The problem size matters because the benefit of an algorithmic advantage is larger for larger problems," explains the team. "This means that if a problem is too small, the classical computer will have already completed the problem by the time the quantum computer's algorithmic benefit kicks in."

Quantum computers are often mentioned in terms of being able to tackle problems that are effectively impossible with classical machines. But the researchers want to guide enterprises on other opportunities too, where a quantum economic advantage exists.

Their analysis also considers technology roadmaps so that companies can assess when the window for using a quantum computer could open up for them.

Problems that become exponentially harder to solve as the size of the problem increases are interesting candidates when thinking about alternatives to using classical computing machines. And Thompson and his co-authors Sukwoong Choi and William Moses provide a useful rule of thumb.

"If a classical algorithm takes exponential time and there exists a polynomial quantum algorithm, you're likely to get a speedup," they comment when discussing their framework on when to use a quantum computer.
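The crossover logic behind this rule of thumb can be sketched in a few lines. The cost models below are illustrative assumptions, not figures from the paper: a classical algorithm taking 2**n steps against a quantum algorithm taking n**2 steps, with a large assumed constant-factor overhead per quantum step (error correction, slower gate times).

```python
# Hypothetical cost models illustrating the rule of thumb. The overhead
# constant is an assumption for illustration, not a figure from the paper.

QUANTUM_OVERHEAD = 10**6  # assumed slowdown of a quantum step vs a classical step

def classical_steps(n):
    return 2 ** n                    # exponential classical algorithm

def quantum_steps(n):
    return QUANTUM_OVERHEAD * n ** 2  # polynomial quantum algorithm, big constant

def crossover():
    """Smallest problem size n at which the quantum algorithm is cheaper."""
    n = 1
    while quantum_steps(n) >= classical_steps(n):
        n += 1
    return n

n_star = crossover()
print(n_star)  # 30: below this size, the classical machine finishes first
```

Even with a million-fold per-step handicap, the exponential classical curve overtakes the quantum one at a modest problem size, which is why the MIT team's framework hinges on both algorithmic complexity and problem scale.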

It's worth pointing out that companies don't have to invest in bare-metal hardware. For most customers, their first experience of what qubits are capable of will be via the cloud using one of a number of QCaaS providers.

Amazon Braket makes it straightforward for firms to work with different types of quantum computers and circuit simulators. Amazon advertises that Braket comes with one free hour of simulation time per month, lowering the cost barrier to getting started.

QCaaS hardware associated with Braket includes gate-based superconducting processors from Rigetti and OQC, neutral-atom quantum processors from QuEra, and IonQ's gate-based ion-trap processors.

Microsoft's Azure Quantum cloud service is another option for firms. Here, users get access to systems from Quantinuum, QCI, and PASQAL, as well as the quantum computing hardware mentioned above.

And companies can also access quantum computing solutions in the cloud using QCaaS platforms operated by developers such as IBM, Google, and D-Wave.

There's no shortage of options, and with frameworks to guide enterprises on where and when to use a quantum computer, now is a good time to think about the types of algorithms supporting your operations and whether qubits can provide an economic advantage to the bottom line.

Excerpt from:
Tips on where and when to use a quantum computer - TechHQ

Europe has lost the AI race. It can’t ignore the quantum computing one – Euronews

The opinions expressed in this article are those of the author and do not represent in any way the editorial position of Euronews.

Quantum computing's physics-oriented behaviour allows it to be infinitely scalable, which is why Europe has to master this mysterious tech if it wants to avoid a brewing dystopia, Koen Bertels writes.

Europe has become known as a second-place destination for business, and more recently, innovation.

Disruptive technologies like AI have hailed from the United States for decades with no European challenger in sight.

However, when a four-week-old French AI startup secured €105 million for its seed round, it demonstrated that Europe isn't as disadvantaged as people think. While AI is a saturated market, quantum computing can allow Europe to survive in a century ruled by China and the US.

Quantum computing will be the foundation for developing AI that can solve real-world problems. That is why Europe must aggressively discover more user applications and grow its quantum talent before its competition does.

The European Union is known for many things, but business isn't its speciality, especially in comparison to the US.

In addition to having a massive population and landmass, the United States has defined innovation for decades. Whether we're speaking about Silicon Valley, Wall Street, or Hollywood, America has achieved rapid growth and quality before many others.

According to IBM, quantum computing is a "rapidly emerging technology that harnesses the laws of quantum mechanics to solve problems too complex for classical computers."

Quantum computing is meant to solve the problems that normal computers can't. These machines offer more power, speed, and accuracy by storing and processing information in multiple states.

This means one-dimensional binary digits (the 1s and 0s) can be run simultaneously in a quantum computer. But the sensitivity of qubits has kept the tech from fully advancing.

In 2022 alone, quantum technology received $2.35 billion (€2.15bn) in investment, since it is crucial for our technology to become smaller, faster, and more powerful.

Although the funding is quite similar between the US and the EU, the Boston Consulting Group has highlighted the weaknesses Europe must fix in order to beat other nations in the quantum race.

According to the firm's report, this is "The Tech Race Europe Can't Afford To Lose", but Europe is already falling behind as the EU lacks coordination, adequate private funding, and strategies for maximizing talent from the earliest educational stages.

The report references Europe's failure to reap rewards from the semiconductor industry as evidence that the region will lose again if it doesn't take a different approach. The group predicts that quantum will create $450-850bn (€412.5-780bn) in value in the next 15 to 30 years.

In addition to being behind in funding, talent, and strategy, Europe isn't only competing against the US. China has contributed the largest investment to the industry thus far. The government claims to have put $15bn (€13.75bn) towards quantum research, with the biggest emphasis on quantum computing and software.

This is creating a new race, much like the AI race, focused solely on finding solutions to problems that our current technology is running out of options for.

Quantum computing isn't the future of our tech world. It's the present. Our most advanced devices use silicon chips that carry billions of transistors with features just nanometres in size.

These tiny transistors act as voltage-controlled switches, but they're almost incapable of shrinking further.

The industry won't be able to make smaller, more efficient chips because, at those scales, transistors begin to behave quantum-mechanically, which is why computing will have to explore this sector.

Although we won't have quantum computers on the market for a long time, Europe needs to research what we can use them for and persuade people to join the industry, because our security depends on it.

Quantum computing will empower governments, companies, and any other owner of this advanced tech to defeat the most complex military, intelligence, and biosecurity threats.

These ultra-fast computers will be able to process vast amounts of satellite data, develop vaccines for viral mutations, simulate nuclear weapon attacks to formulate defence strategies, and even break the encryption protecting highly classified government documents.

If Europe doesn't obtain more talent, funding, and researchers to discover the power of quantum computing, someone else will.

Quantum computing can find the most precise answer among billions of data points because of its non-deterministic approach. Its physics-oriented behaviour allows it to be infinitely scalable, which is why Europe has to master this mysterious tech if it wants to avoid a brewing dystopia.

US and Chinese companies are already racing to develop this tech better than their adversaries.

IBM and Google recently gave $100m and $50m (€91.6m and €45.8m) respectively to US and Japanese universities to research quantum, since China is working to advance its own programs.

Jiuzhang, China's newest quantum computer, is reportedly 180 million times faster on AI tasks and capable of solving problems in a second that would take a supercomputer hundreds of years.

Meanwhile, Europe is struggling to leverage its public investments to counteract Beijing and Washington DC.

To be a respected world power, Europe needs a clearer strategy for utilising private funding, attracting global talent, and finding breakthroughs.

If they fail, the region's national security will be compromised by this world-destroying tech.

Dr Koen Bertels is an internationally acclaimed professor currently teaching quantum engineering at the University of Ghent. He is the founder of QBee, a full-stack quantum computing accelerator.

At Euronews, we believe all views matter. Contact us at view@euronews.com to send pitches or submissions and be part of the conversation.

More:
Europe has lost the AI race. It can't ignore the quantum computing one - Euronews

Quantum Computing Inc. to Present at the Emerging Growth … – PR Newswire

LEESBURG, Va., Nov. 27, 2023 /PRNewswire/ -- Quantum Computing Inc. ("QCi", "we", "our" or the "Company") (Nasdaq: QUBT), an innovative quantum optics and nanophotonics technology company, invites individual and institutional investors, as well as advisors and analysts, to attend its real-time, interactive update presentation at the Emerging Growth Conference on December 6, 2023.

The next Emerging Growth Conference is being held on December 6-7, 2023. This live, interactive online event will give existing shareholders and the investment community the opportunity to interact with the Company's Co-Founder & CEO Robert Liscouski in real time.

Mr. Liscouski will deliver a brief update presentation and may subsequently open the floor for questions. Please submit your questions in advance to [email protected] or ask your questions during the event, and Mr. Liscouski will do his best to get through as many of them as possible.

Quantum Computing Inc. will be presenting at 3:40 PM Eastern time for approximately 10 to 15 minutes. This event is an update to the Company's initial presentation delivered on November 2, 2023.

Please register here to ensure you are able to attend the conference and receive any updates that are released: https://goto.webcasts.com/starthere.jsp?ei=1641026&tp_key=4a8f04de2b&sti=qubt

If attendees are not able to join the event live on the day of the conference, an archived webcast will also be made available on EmergingGrowth.com and on the Emerging Growth YouTube Channel, http://www.YouTube.com/EmergingGrowthConference.

About the Emerging Growth Conference

The Emerging Growth conference is an effective way for public companies to present and communicate their new products, services and other major announcements to the investment community from the convenience of their office, in a time efficient manner.

The Conference focus and coverage includes companies in a wide range of growth sectors, with strong management teams, innovative products & services, focused strategy, execution, and the overall potential for long term growth. Its audience includes potentially tens of thousands of Individual and Institutional investors, as well as Investment advisors and analysts.

All sessions will be conducted through video webcasts and will take place in the Eastern time zone.

About Quantum Computing Inc. (QCi)

Quantum Computing Inc. (QCi) (Nasdaq: QUBT) is an innovative quantum optics and nanophotonics technology company on a mission to accelerate the value of quantum computing for real-world business solutions, delivering the future of quantum computing, today. The company provides accessible and affordable solutions with real-world industrial applications, using nanophotonic-based quantum entropy that can be used anywhere and with little to no training, operates at normal room temperatures and low power, and is not burdened with unique environmental requirements. QCi is competitively advantaged, delivering its quantum solutions at greater speed, accuracy, and security at less cost. QCi's core nanophotonic-based technology is applicable to both quantum computing as well as quantum intelligence, cybersecurity, sensing and imaging solutions, providing QCi with a unique position in the marketplace. QCi's core entropy computing capability, the Dirac series, delivers solutions for both binary and integer-based optimization problems using over 11,000 qubits for binary problems and over 1000 (n=64) qubits for integer-based problems, each of which is the highest number of variables and problem size available in quantum computing today. Using the Company's core quantum methodologies, QCi has developed specific quantum applications for AI, cybersecurity and remote sensing, including its Reservoir Photonic Computer series (intelligence), reprogrammable and non-repeatable Quantum Random Number Generator (cybersecurity) and LiDAR and Vibrometer (sensing) products. For more information about QCi, visit www.quantumcomputinginc.com.

SOURCE Quantum Computing Inc.

See the article here:
Quantum Computing Inc. to Present at the Emerging Growth ... - PR Newswire

Let's Create the Next Generation of Innovators – Duke University

A version of this op-ed was published in the Raleigh News & Observer on November 24, 2023. That version is available on the News & Observer's website.

As a young professor at N.C. State, Jim Goodnight in the mid-1970s teamed with colleagues to build software to analyze agricultural data. That N.C. State team turned a good idea into a great one, spinning that innovation into a product line that birthed SAS, the Cary-based software giant that recorded $3 billion in sales last year and employs more than 12,000 people.

That's the sort of success story we need more of here in North Carolina, which is why the CHIPS and Science Act is so important. The Tar Heel State and the rest of America are on the precipice of a transformational era for our nation's research and innovation enterprise, spurred largely by the work of our research universities. The CHIPS and Science Act, signed into law last year, included a $52 billion boost to the semiconductor industry, a sector where North Carolina companies are well positioned to create new jobs and boost the economy. It would also provide $200 billion to further strengthen the nation's competitive advantage in other fields such as artificial intelligence, quantum computing, energy sciences and bioengineering. This money has been approved but not yet distributed, and time is wasting.

North Carolina is well positioned to capitalize on this investment, but Congress must prioritize this funding in the current and future budget cycles to ensure the nation stays ahead in the increasingly competitive race for global leadership in science and innovation.

The universities in North Carolina are extraordinarily successful in winning research funding; Duke ranks 9th nationally in federal research funding and brings in about $776 million of the more than $2 billion of federal funds that support university research in our state each year. These dollars fuel discoveries that become solutions we all need. The funding attracts and retains talent to our state, provides jobs and prosperity for North Carolinians, and generates long-term and sustainable benefits when companies that are born here decide to stay here. In the last five years, Duke researchers have launched 75 companies around Duke intellectual property; 55 of them, including Sparta Biosciences, which has developed a new chemically engineered cartilage to help people with cartilage degeneration, have stayed right here in North Carolina.

Building this economic engine doesn't occur overnight or even over a few years. It requires long-term and sustained investment and a highly trained workforce.

We face increasingly tough competition for talent as other countries, both allies and adversaries, are substantially increasing investments in science and technology and other STEM fields. Full funding of the science portion of the CHIPS and Science Act will expand opportunities for North Carolina and the country to cultivate and retain homegrown talent and continue to attract the very best from across the globe.

One example of this is the National Science Foundation (NSF) Regional Engines program, which seeks to build innovation capacity across the country. Duke is a partner on a proposal led by UNC Wilmington to unite universities, community colleges, non-profits and businesses to build and sustain coastal and climate resiliency in Eastern North Carolina. This program has great promise to transform regions in North Carolina, and across the country. But NSF currently only has enough funding to support its current round of applicants.

Similarly, our Duke Quantum Center, in downtown Durham, is a major player in large-scale information processing, building ever-larger quantum computer systems. North Carolina could be well positioned to be a leader in quantum computing if the promise of CHIPS and Science is realized.

Were ready for the next step.

Academic research and development is a federal partnership that has galvanized the state's economy for more than 60 years, and one that must remain robust if we want to continue that momentum. The CHIPS and Science Act will further catalyze North Carolina's leadership in discovery-based research, but current projections show a $7 billion funding shortfall from the original spending targets. If it is not fully funded, we will see further stagnation of the nation's economic growth, defense capabilities and global competitiveness.

If we want the great innovations to grow from our soil and benefit our citizens, we need Congress to start distributing the money it approved for use a year ago. Let's create the next generation of innovators.

Vincent Price is president of Duke University.

Read more from the original source:
Let's Create the Next Generation of Innovators - Duke University

Embracing Transformation: AWS and NVIDIA Forge Ahead in … – Nvidia

Amazon Web Services and NVIDIA will bring the latest generative AI technologies to enterprises worldwide.

Combining AI and cloud computing, NVIDIA founder and CEO Jensen Huang joined AWS CEO Adam Selipsky Tuesday on stage at AWS re:Invent 2023 at the Venetian Expo Center in Las Vegas.

Selipsky said he was thrilled to announce the expansion of the partnership between AWS and NVIDIA with more offerings that will deliver advanced graphics, machine learning and generative AI infrastructure.

The two announced that AWS will be the first cloud provider to adopt the latest NVIDIA GH200 NVL32 Grace Hopper Superchip with new multi-node NVLink technology, that AWS is bringing NVIDIA DGX Cloud to AWS, and that AWS has integrated some of NVIDIA's most popular software libraries.

Huang started the conversation by highlighting the integration of key NVIDIA libraries with AWS, encompassing a range from NVIDIA AI Enterprise to cuQuantum to BioNeMo, catering to domains like data processing, quantum computing and digital biology.

"The partnership opens AWS to millions of developers and the nearly 40,000 companies who are using these libraries," Huang said, adding that it's great to see AWS expand its cloud instance offerings to include NVIDIA's new L4, L40S and, soon, H200 GPUs.

Selipsky then introduced the AWS debut of the NVIDIA GH200 Grace Hopper Superchip, a significant advancement in cloud computing, and prompted Huang for further details.

"Grace Hopper, which is GH200, connects two revolutionary processors together in a really unique way," Huang said. He explained that the GH200 connects NVIDIA's Grace Arm CPU with its H200 GPU using a chip-to-chip interconnect called NVLink, at an astonishing one terabyte per second.

Each processor has direct access to the high-performance HBM and efficient LPDDR5X memory. This configuration results in 4 petaflops of processing power and 600GB of memory for each superchip.

AWS and NVIDIA connect 32 Grace Hopper Superchips in each rack using a new NVLink switch. Each 32-GH200, NVLink-connected node can be a single Amazon EC2 instance. When these are integrated with AWS Nitro and EFA networking, customers can connect GH200 NVL32 instances to scale to thousands of GH200 Superchips.

"With AWS Nitro, that becomes basically one giant virtual GPU instance," Huang said.

"The combination of AWS's expertise in highly scalable cloud computing plus NVIDIA's innovation with Grace Hopper will make this an amazing platform that delivers the highest performance for complex generative AI workloads," Huang said.

"It's great to see the infrastructure, but it extends to the software, the services and all the other workflows that they have," Selipsky said, introducing NVIDIA DGX Cloud on AWS.

This partnership will bring about the first DGX Cloud AI supercomputer powered by the GH200 Superchips, demonstrating the power of AWS's cloud infrastructure and NVIDIA's AI expertise.

Following up, Huang announced that this new DGX Cloud supercomputer design in AWS, codenamed Project Ceiba, will also serve as NVIDIA's newest AI supercomputer for its own AI research and development.

Named after the majestic Amazonian Ceiba tree, the Project Ceiba DGX Cloud cluster incorporates 16,384 GH200 Superchips to achieve 65 exaflops of AI processing power, Huang said.
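The headline numbers quoted here are internally consistent, as a quick back-of-envelope check shows (using only the figures stated above: 4 petaflops and 600GB per superchip, 32 superchips per NVL32 rack, 16,384 superchips in Project Ceiba):

```python
# Sanity-check the quoted GH200/Ceiba figures against each other.
PFLOPS_PER_SUPERCHIP = 4          # petaflops quoted per GH200 superchip
MEMORY_GB_PER_SUPERCHIP = 600     # GB of memory quoted per GH200 superchip
CHIPS_PER_NVL32 = 32              # superchips per NVL32 rack
CHIPS_IN_CEIBA = 16_384           # superchips in the Project Ceiba cluster

rack_pflops = CHIPS_PER_NVL32 * PFLOPS_PER_SUPERCHIP                # 128 PF/rack
rack_memory_tb = CHIPS_PER_NVL32 * MEMORY_GB_PER_SUPERCHIP / 1000   # 19.2 TB/rack

# 16,384 superchips x 4 petaflops = 65,536 petaflops, i.e. ~65.5 exaflops,
# matching the "65 exaflops" figure quoted for Ceiba.
ceiba_exaflops = CHIPS_IN_CEIBA * PFLOPS_PER_SUPERCHIP / 1000
print(rack_pflops, rack_memory_tb, round(ceiba_exaflops, 1))  # 128 19.2 65.5
```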

"Ceiba will be the world's first GH200 NVL32 AI supercomputer built and the newest AI supercomputer in NVIDIA DGX Cloud," Huang said.

Huang described the Project Ceiba AI supercomputer as "utterly incredible," saying it will be able to halve the training time of the largest language models.

NVIDIA's AI engineering teams will use this new supercomputer in DGX Cloud to advance AI for graphics, LLMs, image/video/3D generation, digital biology, robotics, self-driving cars, Earth-2 climate prediction and more, Huang said.

DGX is "NVIDIA's cloud AI factory," Huang said, noting that AI is now key to NVIDIA's own work in everything from computer graphics to creating digital biology models to robotics to climate simulation and modeling.

"DGX Cloud is also our AI factory to work with enterprise customers to build custom AI models," Huang said. "They bring data and domain expertise; we bring AI technology and infrastructure."

Huang also announced that AWS will bring four Amazon EC2 instance types based on the NVIDIA GH200 NVL, H200, L40S, and L4 GPUs to market early next year.

Selipsky wrapped up the conversation by announcing that GH200-based instances and DGX Cloud will be available on AWS in the coming year. You can catch the discussion and Selipsky's entire keynote on AWS's YouTube channel.

View post:
Embracing Transformation: AWS and NVIDIA Forge Ahead in ... - Nvidia