Archive for the ‘Quantum Computing’ Category

Investors like the appeal of Quantum computing, but reward comes with risk – iTnews

The announcement last night that Canberra-based QuintessenceLabs had completed a $25 million Series B capital raising demonstrates investor interest in the emerging field. Like all venture investments, however, it comes with risk, especially in an immature field such as quantum computing.

The fundraising was led by CSIRO-founded Main Sequence and TELUS Ventures, with participation from Mizuho Financial Group-backed InterValley Ventures and Terry Snow's Capital Property Group.

According to business statistics site Statista, revenues for the global quantum computing market are projected to exceed $US1.76 billion by 2026, up from $US472 million this year, an average compound annual growth rate of a tick above 30 per cent. By way of spurious comparison, the global market for loot boxes (basically lucky dips inside competitive computing games) was worth $US15 billion in 2020.
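
Those Statista figures are easy to sanity-check. A few lines of arithmetic, taking 2021 as the base year per the article, confirm the quoted growth rate:

```python
# Growing $US472m (2021) to $US1.76bn (2026) over five years implies a
# compound annual growth rate of roughly 30 per cent, as the article says.
start, end, years = 472, 1760, 5
cagr = (end / start) ** (1 / years) - 1
print(f"CAGR: {cagr:.1%}")  # a tick above 30 per cent
```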

The discipline's novelty means it comes with significant risk for investors owing to the nature of the impediments which need to be overcome to ensure widespread commercial deployment.


Despite the aspirations of entrepreneurs and investors, and the often gimlet-eyed assessments of industry commentators, quantum cyber security and quantum computing more generally remain commercially young fields, even after decades of lab research.

Steady on

A paper by Deloitte authors Itan Barnes, Bartosz Czaszyński and Ruud Schellekens, called "Quantum computers and their impact on Cyber Security", argues that building the hardware for a quantum computer is a formidable challenge.

"Qubits (in loose terms, the quantum-mechanical analogue of a classical bit) are by their nature very fragile and can lose the information encoded in them very quickly (the decoherence problem)."

In their paper, they write that the major challenge is to keep the qubits completely isolated from the environment while allowing high precision control and readout of the qubit state.

To effectively decouple qubits from any noise source, and therefore sustain longer coherence times, these systems are typically cooled to extremely low temperatures using liquid helium.

This, they say, puts a heavy burden on the size of the system and results in high running costs.
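
A rough way to picture the decoherence problem Deloitte describes: the chance a qubit retains its encoded state falls off roughly exponentially, governed by a characteristic coherence time. The sketch below uses an illustrative 100-microsecond coherence time, not a measured figure for any particular machine:

```python
import math

# Toy decoherence model: information retention decays exponentially with
# time, with a characteristic coherence time T2 (illustrative value only).
T2 = 100e-6  # assumed coherence time: 100 microseconds

def coherence(t):
    """Fraction of encoded information surviving after t seconds."""
    return math.exp(-t / T2)

for t_us in (10, 100, 500):
    print(f"after {t_us:>3} microseconds: {coherence(t_us * 1e-6):.3f}")
```

The point of the cooling and isolation described above is to push T2 up; every microsecond gained buys more quantum operations before the state is lost.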

There are many different ways to realise qubits, e.g. trapped ions, superconducting rings and many others. Each architecture has its advantages and drawbacks, and it is not yet clear which qubit technology is the most scalable.

Still, proponents argue that quantum computing is the next great threat to data and information.

In its funding announcement, QuintessenceLabs claims: "Quantum computing capabilities and power transcend that of current computing, making today's information vulnerable to quantum computing attacks and data breaches. Organisations need to start assessing their cybersecurity posture from a quantum-safe perspective."

"As computing power increases exponentially, the tools needed to secure critical data and assets must stay several steps ahead," said Bill Bartee, partner at Main Sequence. "Dr Sharma and the team at QuintessenceLabs are global leaders in developing quantum-based cybersecurity tools that help protect sovereign and commercially sensitive information. We're excited to support the QuintessenceLabs team as they scale their business and provide customers with a critical layer of protection."

Define soon

In its funding announcement, QuintessenceLabs says quantum cybersecurity will soon become mainstream and will be one of the critical pillars of a robust cybersecurity strategy for most organisations.

Writing for Telefónica earlier this year, Gonzalo Álvarez Marañón noted that presently only 1 per cent of organisations are investing in quantum computing and quantum computers, although he does note an expectation that "in a decade or two we will be enjoying error-free quantum computers of thousands of qubits".

In the current world of real things, the largest quantum computer is IBM's Quantum System One at 65 qubits. IBM still has aspirations to launch a 127-qubit computer this year, although the clock is ticking.


Quantum algorithms that speed up banking operations 100x are here – Sifted

Forget what Goldman Sachs said about useful quantum computing in finance being five years away. Banks can already get a 100-fold advantage by using quantum computers to solve problems such as portfolio optimisation and fraud detection, says Spanish startup Multiverse Computing.

The company, which raised a €10m seed funding round today, has developed a quantum software product that it is supplying to customers including BBVA, Bankia, the European Tax Agency and the Bank of Canada.


"It depends on the problem you are trying to solve," says Enrique Lizaso, chief executive. "For some problems, like optimising investment portfolios and machine learning to detect banking fraud, quantum computers can already outperform classical computers today."

Tasks like these can be done 100 times faster using quantum rather than classical computers, even though today's quantum computers are still relatively limited, with fewer than 100 qubits and high error rates.

Goldman Sachs, which has been working with quantum startup QC Ware to test out practical applications, has estimated that it would take machines with 7,500 qubits and five more years for quantum computing to be of practical use to the financial services industry. Jeremy O'Brien, CEO of photonic quantum computing company PsiQuantum, says quantum computers must reach 1m qubits, something he believes is a decade away, to be useful.

However, by focusing on problems that are particularly well suited to quantum computers, Multiverse says it is possible to get quantum advantage even with today's small, error-prone machines.

The San Sebastián-headquartered company matches the algorithm, on the back end, to the type of quantum computer best suited to the problem. For example, D-Wave's machines are good for optimisation problems, while IBM's and IonQ's quantum computers perform better on machine learning, says Lizaso.
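
As a loose illustration of why optimisation problems suit annealers like D-Wave's, a small portfolio-selection task can be written as a QUBO (quadratic unconstrained binary optimisation), the form such machines natively accept. At toy scale it can be solved by brute force; the returns, covariances and risk-aversion weight below are invented for the example, not Multiverse's data:

```python
import itertools

# Hypothetical four-asset selection problem cast in QUBO form:
# choose x in {0,1}^4 to minimise risk_aversion * risk(x) - return(x).
returns = [0.08, 0.12, 0.10, 0.07]            # made-up expected returns
risk = [[0.10, 0.02, 0.01, 0.00],             # made-up covariance matrix
        [0.02, 0.12, 0.03, 0.01],
        [0.01, 0.03, 0.09, 0.02],
        [0.00, 0.01, 0.02, 0.08]]
risk_aversion = 1.0

def objective(x):
    ret = sum(r * xi for r, xi in zip(returns, x))
    var = sum(risk[i][j] * x[i] * x[j]
              for i in range(4) for j in range(4))
    return risk_aversion * var - ret

# Brute force stands in here for what an annealer would sample at scale.
best = min(itertools.product([0, 1], repeat=4), key=objective)
print(best)
```

With four assets there are only 16 candidate portfolios; with 40 there are over a trillion, which is where annealing hardware is pitched.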

Other types of problem remain tricky to solve with quantum computing, such as working out pricing for complex financial instruments (which relies on running so-called Monte Carlo simulations, the application QC Ware and Goldman Sachs were looking at) or running large-scale simulations. For these, quantum machines with larger numbers of qubits are likely to be needed.
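
For context, a Monte Carlo pricer of the kind referred to here repeatedly simulates random price paths and averages the discounted payoff. The sketch below prices a European call under standard Black-Scholes assumptions; all parameters are illustrative, not drawn from the article:

```python
import math
import random

# Bare-bones Monte Carlo pricer for a European call option: simulate the
# terminal price under geometric Brownian motion, average the discounted
# payoff over many paths.
def mc_call_price(spot, strike, rate, vol, maturity,
                  n_paths=100_000, seed=42):
    rng = random.Random(seed)
    drift = (rate - 0.5 * vol ** 2) * maturity
    diffusion = vol * math.sqrt(maturity)
    total = 0.0
    for _ in range(n_paths):
        terminal = spot * math.exp(drift + diffusion * rng.gauss(0, 1))
        total += max(terminal - strike, 0.0)
    return math.exp(-rate * maturity) * total / n_paths

print(round(mc_call_price(100, 100, 0.01, 0.2, 1.0), 2))
```

The error of such an estimate shrinks only with the square root of the number of paths, which is why speeding up Monte Carlo is an attractive, if still distant, quantum target.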


"Imagine an early computer in the 1970s and asking it to recognise voice and search the internet: it would be impossible. But you could use it to run a spreadsheet, so it would still be useful. That's what it is like with quantum computers today," says Lizaso.

Given the fierce competition in the industry to hire top talent and win relationships with the biggest clients, Multiverse wants to grow quickly. The company is aiming to expand the service into other sectors such as energy, and to grow headcount from the current 27 staff to close to 50 by the end of this year and more than 200 by 2027. It is planning to raise a further €50m next year.


"You have to grow like hell at the moment," says Lizaso, noting that many of Multiverse's competitors are large tech incumbents like IBM and Google, and that quantum software rival Cambridge Quantum Computing recently merged with Honeywell to give itself more firepower.

Multiverse's seed round was led by JME Ventures and also included Quantonation, EASO Ventures, Inveready, Mondragón Fondo de Promoción, Ikerlan, LKS, and Penja Strategy.


The Quantum Leap Hinges on Worker Skills and Supply Chain Limits – Government Accountability Office

Remember the sci-fi show Quantum Leap? In the late 1980s and early 90s, Quantum Leap imagined that, in the near future, quantum technologies could be used to, among other things, travel back in time to solve crimes and correct historic wrongs.

Fast-forward to 2021. Where is quantum technology now, and what challenges does its use and development face? Today's WatchBlog post explores our recent report on quantum technologies.

What can quantum technology really do?

Well, not time travel!

Quantum Leap is of course fictional. But, in the near future, quantum technologies have the potential to transform many industries by harnessing the properties of nature at atomic (or very small) scales. For example, quantum computers could help reduce the time needed to create a lifesaving drug by increasing the accuracy and speed of chemistry simulations.

However, in addition to the good it could do, quantum technology could be dangerous in the wrong hands. For example, hackers or enemy states could use it to break the privacy of information sent over the internet, potentially stealing secrets, by efficiently factoring extremely large numbers, a difficult mathematical task that underpins many encryption schemes.
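
To make the factoring point concrete: schemes such as RSA stay secure only because no known classical method factors the product of two large primes efficiently. Trial division, shown below on a toy four-digit modulus, illustrates the task; at real key sizes (hundreds of digits) it is classically hopeless, whereas Shor's algorithm on a sufficiently large quantum computer would not be:

```python
# Classical trial division: fine for a toy modulus, useless at the sizes
# real encryption uses. Breaking RSA amounts to recovering p and q from n.
def factor(n):
    p = 2
    while p * p <= n:
        if n % p == 0:
            return p, n // p
        p += 1
    return n, 1  # n is prime

print(factor(3233))  # -> (53, 61): the two secret primes behind this toy key
```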

How soon could we see this technology?

While quantum technologies are available for limited uses now, they will likely take at least a decade and cost billions to develop for more complex uses.

We interviewed experts, including federal agency officials, about quantum technology's prospects. They told us about the need for the U.S. to develop a strong quantum workforce and address supply chain issues in order to maintain its leadership position in quantum technology hardware and software development.

What can be done to strengthen the workforce?

Right now, the U.S. educational system does not equip enough graduates with the right skills to undertake the complex tasks required for quantum technology development. For example, other countries provide far more training in the production of superconducting circuits, an essential quantum computing component, according to one stakeholder. Further, quantum jobs often require skills in fields that don't normally go together, like engineering and quantum mechanics. As a result, there are more quantum technology jobs than people to fill them.

To build a quantum workforce, the U.S. may need more training capacity in quantum engineering, circuit design, logic, and algorithm development. U.S. quantum firms may also want to hire more foreign nationals, which may be challenging because of visa requirements and the national security implications of quantum technology.

What can be done to address supply chain issues?

The quantum technology supply chain is global and specialized. Given that complexity, if a single link in the chain becomes unavailable, it could cause technology development delays and other setbacks. For example, quantum computers require super-cooling devices called dilution refrigerators, which are manufactured in Finland.

The quantum technology market is not currently large enough to support the commercial availability of specialized components. Manufacturing facilities, such as foundries, make money by producing a large quantity of an item. The small number of items needed for quantum technology research may not provide manufacturers financial incentive. Quantum foundries, or specialized manufacturing facilities, could help to produce quantum technologies at scale. In particular, they could help develop materials needed to enable quantum technologies, train the quantum workforce, and accelerate quantum technology development.

To learn more about the opportunities of quantum technology and the challenges facing its development, check out our new report.


Innovations Shaping the Future – Qrius

It took humans over 65 thousand years to invent the wheel, one of the most significant technological breakthroughs in the history of humanity. Today, almost six thousand years after that invention, several no less important revolutions happen each year, and the pace of change continues to accelerate.

New technologies are literally changing the world, making our tomorrow mysterious and unpredictable. But what can we do to embrace the future? We unfortunately cannot make accurate predictions, but we can certainly keep track of the software development trends appearing on the horizon.

In this article, we outline technologies that have the potential to spread around the globe in the next one to ten years. So let's look behind the veil of time together!

The idea of quantum computing is not quite new. Scientists started thinking about it at the beginning of the 1980s, and it is still at an early stage of research and development. Nevertheless, quantum computing has already become an area of focus in the tech world, and it's anticipated to become a general trend in software development within the next decade. So what is so special about quantum computers?

In quantum computing, data is encoded in so-called quantum bits (qubits) that can be in superpositions of states. In simple words, this means that a qubit can represent 0 and 1 at the same time, unlike a regular bit, which is in a state of either 0 or 1. Superposition allows quantum computers to tackle certain problems with far greater capacity, improve the accuracy of certain predictions, and support encryption schemes that are extremely hard to break. It's expected that quantum computers will eventually outperform traditional computers, even the most powerful ones, on such tasks.
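
The superposition idea can be made concrete with a two-amplitude state vector. This minimal sketch (plain Python, illustrative only) shows an equal superposition and the measurement probabilities it implies:

```python
import math

# A qubit state is a unit vector of two amplitudes (alpha, beta).
# Measuring it yields 0 with probability alpha^2 and 1 with probability
# beta^2 (amplitudes are complex in general; real values suffice here).
alpha, beta = 1 / math.sqrt(2), 1 / math.sqrt(2)  # equal superposition

assert abs(alpha**2 + beta**2 - 1) < 1e-9  # a valid state is normalised
print(f"P(0) = {alpha**2:.2f}, P(1) = {beta**2:.2f}")  # 0.50 and 0.50
```

A register of n qubits carries 2^n such amplitudes at once, which is where the capacity claims above come from.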

The largest tech corporations are now racing to build the first quantum chip that actually works as it is supposed to, but the main problem is that it is quite hard to maintain a qubit in a superposition for a long (or at least sufficient) period of time, as it tends to collapse into a particular state (0 or 1) when external conditions change.

Google, Intel, IBM and Microsoft have already built quantum computers, but all of them have far too few qubits to have real commercial value. Besides, China is set to build a $10 billion laboratory to conduct quantum computing experiments.

It's expected that we'll have the first real use cases within the next 10 years or so. Yet the exact date of quantum supremacy, i.e. the moment a quantum computer can do something that ordinary computers cannot, is still hard to predict.

Voice search is not a brand-new technology in software development, but it is one of the most popular tech trends nowadays. The reason is that it's expected to replace regular searches via touchscreens and keyboards within the next year or two.

The above predictions are based on some quite impressive statistics. For example, in 2017, one in four shoppers in the US used voice assistants to do their holiday shopping, and it's anticipated that about 50% of all online searches will be voice searches by 2020.

Among voice-activated engines, Amazon Alexa, the virtual assistant developed by Amazon, holds the leading position in the market, but its rivals do not lag far behind. Google, Microsoft and other companies are working to improve the accuracy of speech recognition and expand the range of questions users may ask.

For example, Google recently announced speakable markup, which allows publishers to indicate the parts of articles to be read aloud by Google Assistant, so the latter can respond to a user's query with the most relevant excerpt and then ask the user if they want to hear the rest of the text.

This app development trend also has good prospects in the long run, since chatbots that can recognise and reply to voice commands will eventually replace customer service agents and even sellers in shopping centres.

Building digital twins is a software development trend that has a good chance of conquering the manufacturing and engineering industries in the next three to five years. In simple words, a digital twin is a virtual model of a physical building, product, system or process. Thanks to numerous sensors and the application of artificial intelligence, digital twins are closely linked to their original objects, which allows companies to track changes in assets and optimise their performance.

Digital twins are widely used in the space industry, as the whole idea was developed to help maintain and repair spacecraft; however, it's expected that about 50% of large manufacturers will use this technology by 2021. Hence, exploiting digital twins is turning from a software development trend into a must-have tool for any business that wants to stand out from competitors.

Virtual Reality (VR) and Augmented Reality (AR) have been used by companies for a few years now, but it seems their popularity hasn't reached its peak yet. According to International Data Corporation, total spending on VR and AR projects will reach around $160 billion in 2021, almost twenty times more than total spending on similar products in 2017.

Such rapid growth of these new technologies in software development may be explained by the fact that immersive solutions have already proved their commercial relevance, and not only in the gaming industry. Businesses no longer look at VR and AR as interesting things to try; companies are starting to consider them tools to reshape their business activity and take interaction with customers to the next level. Many big brands, such as Ikea, Lego, Audi, Toyota and Farber, have already followed this trend in software development by building AR/VR apps, so it is unlikely to take other businesses much time to do the same.

New technologies impact the way we live and do business. In this article, we mentioned only some of the latest software development trends which we think will influence our future the most. But the best thing about tech disruptions is that you can not only observe them, you can be a part of them.



Rockport Networks Announces Availability of New Switchless Network and Collaboration with TACC – HPCwire

OTTAWA, Ontario, Oct. 26, 2021 – Rockport Networks today announced the commercial availability of its new switchless network architecture, which delivers the industry-leading performance and scalability necessary for performance-intensive computing workloads including HPC, AI and ML. Addressing the longstanding data center chokepoints that are throttling innovation, the Rockport Switchless Network offers a completely new design, using breakthrough software and data-routing techniques to overcome congestion and deliver predictable performance improvements of more than 3x over centralized switch-intensive networks.

The Rockport Switchless Network distributes the network switching function to endpoint devices, so that these devices (nodes) become the network. By eliminating layers of switches, the Rockport Switchless Network also frees up significant rack space for compute and storage, while saving on associated power, cooling and administrative overhead. Compute and storage resources are no longer starved for data, and researchers gain more predictability around workload completion times.

"Rockport was founded based on the fact that switching, and networking in general, is extremely complicated. Over the years, this complexity has forced organizations to make tradeoffs when it comes to performance at scale, so we decided to make it simpler," said Doug Carwardine, CEO and co-founder, Rockport Networks. "We made it our mission to get data from a source to a destination faster than other technologies. Removing the switch was crucial to achieving significant performance advantages in an environmentally and commercially sustainable way."

Rethinking network switches creates an opportunity to leverage direct interconnect topologies that provide a connectivity mesh in which every network endpoint can efficiently forward traffic to every other endpoint. The Rockport Switchless Network is a distributed, highly reliable, high-performance interconnect providing pre-wired supercomputer topologies through a standard plug-and-play Ethernet interface.

The Rockport Network Operating System (rNOS) is the software at the core of the Rockport Switchless Network; it runs on the Network Card, fully offloaded from the compute cores and the server operating system. The rNOS enables the network to self-discover, self-configure and self-heal. Like a navigation app for data, this patented approach selects and continually optimizes the best path through the network to minimize congestion and latency, while breaking packets into smaller pieces (FLITs) to ensure high-priority messages are not blocked by large messages or bulk data transfers. Designed to integrate easily into existing and emerging data centers, the Rockport Switchless Network installs in a fraction of the time required to cable traditional switch-based networks. As an embeddable architecture, it will work in any form factor. Today the software is deployed and managed using three main components:

Rockport's technology delivers significant TCO, sustainability and security benefits, including:

"When the root of the problem is the architecture, building a better switch just didn't make sense," said Matt Williams, CTO, Rockport Networks. "With sophisticated algorithms and other purpose-built software breakthroughs, we have solved for congestion, so our customers no longer need to just throw bandwidth at their networking issues. We've focused on real-world performance requirements to set a new standard for what the market should expect from the fabrics of the future."
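
To illustrate in the abstract why flit-level forwarding of the kind described above helps with congestion (this is a toy model, not Rockport's implementation): once messages are broken into small flits, a link scheduler can interleave flits from different flows, so a short high-priority message is not stuck behind a bulk transfer.

```python
from collections import deque

# Toy link scheduler: per-flow queues of fixed-size flits, serviced
# round-robin one flit at a time.
def flits(message, flit_size=4):
    return [message[i:i + flit_size] for i in range(0, len(message), flit_size)]

queues = {
    "bulk":     deque(flits("X" * 32)),   # a large background transfer
    "priority": deque(flits("ping")),     # a short urgent message
}

order = []
while any(queues.values()):
    for name, q in queues.items():
        if q:
            q.popleft()
            order.append(name)

# The priority flit departs after at most one bulk flit, instead of
# waiting for the entire bulk transfer to finish.
print(order[:3])
```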

After an extensive beta program with cloud service providers and elite government and academic research labs, the Rockport Switchless Network is being deployed by customers including the University of Texas Advanced Computing Center (TACC). The company is also working with industry organizations including Ohio State University (OSU) to contribute to performance-intensive networking standards.

TACC Establishes New Center of Excellence with Rockport Networks

As part of today's news, Rockport Networks is announcing a collaboration with TACC to create a Center of Excellence in Austin, Texas. TACC houses Frontera, the fastest supercomputer on a university campus and the 10th most powerful supercomputer in the world. TACC has installed 396 nodes on Frontera running production workloads on Rockport's switchless network, including quantum computing and pandemic-related life sciences research, as well as workloads focused on rapid responses to emergencies such as hurricanes, earthquakes, tornadoes, floods and other large-scale disasters.

"TACC is very pleased to be a Rockport Center of Excellence. We run diverse advanced computing workloads which rely on high-bandwidth, low-latency communication to sustain performance at scale. We're excited to work with innovative new technology like Rockport's switchless network design," stated Dr. Dan Stanzione, director of TACC and associate vice president for research at UT Austin. "Our team is seeing promising initial results in terms of congestion and latency control. We've been impressed by the simplicity of installation and management. We look forward to continuing to test on new and larger workloads and expanding the Rockport Switchless Network further into our data center."

Industry Validation

"The team at Durham continues to push the bounds when it comes to uncovering next-generation HPC network technologies," said Alastair Basden, DiRAC/Durham University, technical manager of the COSMA HPC cluster. "Based on a 6D torus, we found the Rockport Switchless Network to be remarkably easy to set up and install. We looked at codes that rely on point-to-point communications between all nodes with varying packet sizes, where congestion can typically reduce performance on traditional networks. We were able to achieve consistent low latency under load and look forward to seeing the impact this will have on even larger-scale cosmology simulations."

"There's been significant evidence to suggest that switchless architectures have the capacity to significantly up-level application performance that traditionally has come at great cost," stated Earl C. Joseph, CEO, Hyperion Research. "Making these advances more economically accessible should greatly benefit the global research community and hopefully improve expectations relative to what we can expect from the network when it comes to return-on-research and time-to-results."

"Our mission is to provide the advanced computing community with standard libraries such as MVAPICH2 that support the very best performance available in the market. We make it a top priority to keep our libraries fresh with innovative approaches, like Rockport Networks' new switchless architecture," said DK Panda, professor and distinguished scholar of computer science at the Ohio State University, and lead for the Network-Based Computing Research Group. "We look forward to our ongoing partnership with Rockport to define new standards for our upcoming releases."

Availability and Additional Information

The Rockport Switchless Network is available immediately.

For additional information, including full product details, please visit: https://rockportnetworks.com.

Rockport Networks Launch Activities

Following the launch of the Rockport Switchless Network, the company will participate in a number of industry events including:

Virtual Roundtable with HPCwire, November 10: Rockport joins Dell, TACC, OSU and Hyperion Research for a virtual discussion, "Quieting Noisy Neighbors: Keys to Addressing Congestion in Multi-workload Environments."

Rockport Networks at SC21, November 14-19: Rockport is a sponsor of the SC21 Virtual Exhibit. Rockport experts will be available to answer questions in the virtual exhibit hall, and Rockport CTO Matthew Williams will present "Advanced congestion control for addressing network performance bottlenecks using a next-generation interconnect," with a live Q&A. Learn more at http://www.rockportnetworks.com/sc21.

About Rockport Networks

Rockport Networks' next generation of high-performance networks unlocks the entire data center to produce more results, faster, and with better economics and environmental sustainability. Modeled after the world's fastest supercomputers, the Rockport Switchless Network replaces centralized switch architectures with a distributed, high-performance direct interconnect that is self-discovering, self-configuring and self-healing, and that is simple and transparent to operate. By virtually eliminating congestion and latency, data center workloads can be completed significantly faster, enabling organizations to improve ROI and make critical decisions more quickly. Learn more at www.rockportnetworks.com.

Source: Rockport Networks
