Archive for the ‘Quantum Computing’ Category

Role of Quantum Computing and AI in Healthcare Industry – Analytics Insight

Healthcare is one of our age's major achievements. Medical research has advanced rapidly, extending life expectancy around the world. However, as people live longer, healthcare systems face increased demand, rising costs, and a workforce straining to meet patients' needs.

Population aging, changing patient needs, shifting lifestyle choices, and the never-ending loop of innovation are just a few of the relentless forces driving that demand. The consequences of an aging population stand out among them.

With the classical computing approach, the resources a machine needs roughly double every time the volume of data doubles. Processing the vast amounts of data generated in fields such as healthcare, manufacturing, big data, and financial services is therefore difficult and time-consuming.

Quantum computing, by contrast, doubles a computer's potential with each additional qubit rather than requiring the machine to grow. Without an expanding footprint, quantum computers can process progressively larger volumes of data in near real time. Quantum computing is already being used in a variety of industries with vast volumes of data to solve previously intractable problems quickly.
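To make that scaling concrete, here is a minimal sketch in plain Python; the figures are standard textbook arithmetic rather than anything from the article:

    # A register of n qubits can hold a superposition over 2**n basis states,
    # so each additional qubit doubles the space the machine can explore,
    # whereas n classical bits hold just one n-bit value at a time.
    for n in (1, 2, 10, 50, 127):
        print(f"{n:>3} qubits -> 2**{n} = {2 ** n:,} basis states")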

Quantum computing's advantages are already being observed in healthcare, particularly in personalized medicine, where researchers and healthcare providers are working to forecast health risks and find the best therapy for groups of people who share certain features. Personalized medicine, in comparison to conventional medicine, is patient-centered care that analyses a patient's genetic profile to identify health risks and provide therapies that are tailored to their specific needs.

Specialists in the burgeoning sector increasingly depend on quantum computers' unique capacity to tackle complicated data-management challenges at high speed in order to process enormous amounts of health data drawn from millions of disparate data points. This supports the development of personalized medicine and its favorable impact on healthcare systems.

Researchers discussed their efforts to develop policies that address critical concerns about emerging technologies, highlighting the distinctions between capacity-building open basic research and applied competitive research with direct national-defense and commercial ramifications.

Foster discussed impending legislation that will expand the National Quantum Initiative by assisting in the creation of a larger pool of workers with the highly specialized skills required. The money will be used to boost military training as well as quantum-related college programs. The goal is to strengthen the Department of Defense's quantum staff, which will aid in the attempt to harness quantum's power and speed to solve the most difficult problems.

Dr. Paul Lopata, Principal Head for Quantum Science, shared his thoughts on what businesses should be doing now to set themselves up for future quantum success. He emphasized that high-performance computing is made up of supercomputers, application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), and GPUs, rather than a single technology.

According to Lopata, businesses should play the long game with quantum computing and begin thinking about the future now. He shared his three steps toward long-term thinking in quantum computing:

1. Adhere to the values of your company.

2. Develop your own specialty.

3. Collaborate with organizations that share your values.

Quantum computing can be one of several game-changing technologies that help us improve our ability to ensure healthy lives and promote well-being for people of all ages, and that help us build a more sustainable society in the long term. Quantum computing combined with artificial intelligence allows us to address some of today's most pressing concerns while also creating repeatable and scalable technology foundations and procedures as we strive toward global healthcare for all.

AI applications affect care delivery, including how existing tasks are completed and how they are disrupted by changing healthcare requirements or the processes necessary to fulfill them. From day-to-day operational improvements in clinical organizations to population-health management and the realm of healthcare technology, such applications support and develop healthcare delivery. AI is a broad term here, encompassing natural language processing (NLP), image processing, and machine-learning-based predictive analytics.

While there are many questions about what actually counts as AI in healthcare today, this paper examined 23 applications currently in use and presents case studies for 14 of them. These examples show how AI can impact a wide range of domains, from applications that help patients manage their own treatment, to online symptom checkers and e-triage AI systems, to virtual assistants that can perform duties in hospitals, and a bionic pancreas to assist patients with diabetes.

Read more:
Role of Quantum Computing and AI in Healthcare Industry - Analytics Insight

November: Smart Internet Lab | News and features – University of Bristol

The University of Bristol's pioneering Smart Internet Lab will work with industry partners to develop the first blueprint for a quantum data centre, as part of UKRI's £170 million Commercialising Quantum Technologies Challenge.

Quantum technologies, in the form of quantum computing and communications, promise to provide solutions to some of the world's most challenging problems. However, to date, very little has been understood from a systems perspective about how to integrate them with existing data centres.

The Quantum Data Centre of the Future project will commence in early 2022, bringing experts in classical data centres and networking together with experts in quantum computing and quantum communications, to develop the first blueprint for a quantum data centre.

The project will leverage the significant research strengths of the University of Bristol's High Performance Networks Group in classical data centres, the quantum internet and quantum networking.

Professor Reza Nejabati, Head of the High Performance Networks Research Group in the Smart Internet Lab, said: "This is a truly exciting initiative. Adapting quantum computing and network systems to work in a data centre setting will require significant acts of invention and creativity.

"This will bring a more practical light to the field of quantum technologies so they can benefit businesses and support the emergence of new type quantum computing algorithms and applicationsthat will benefit from them far into the future.

"In collaboration with the project partners, we aim to design, develop and demonstrate a solution for integrating a quantum computer into a classical data centre, as well as providing remote quantum-secure access to quantum computers at scale and in a data centre setting."

Professor Dimitra Simeonidou, Director of the Smart Internet Lab, said: "Quantum computers and communications systems are often described in isolation, but this misses the possibility for near-term value to be created with quantum/classical hybrid systems. In this project, we will be investigating system-level solutions for optical metro quantum networks supporting remote access to quantum computing.

"We are really excited to work with leading industrial and academic partners to connect and integrate our city-scale testbed with a remote quantum-accelerated data centre and demonstrate its use for future industrial applications."

See more here:
November: Smart Internet Lab | News and features - University of Bristol

Pistoia Alliance predicts a focus on the fight against antimicrobial resistance and a surge in quantum computing research for 2022 – Bio-IT World

Boston, US, 22 November 2021: The Pistoia Alliance, a global, not-for-profit alliance that advocates for greater collaboration in life sciences R&D, has today outlined its predictions for the life sciences industry in 2022. The predictions come from three experts recently appointed by the Alliance to drive collaboration efforts across its three key themes. Their insights span the urgent fight against antimicrobial resistance, the potential of quantum computing and commercial space travel, and autonomous laboratories. Throughout 2021, digital transformation has continued to accelerate and the pharmaceutical industry has further embraced collaboration, both of which will underpin success in these emerging areas in the next 12 months.

Linda Kasim, Empowering the Patient theme lead, Pistoia Alliance: In 2022, the fight against superbugs and antimicrobial resistance (AMR) will be a renewed priority, driven primarily by public-private partnerships and by funding from philanthropic organizations, governments and international bodies to incentivize research. The public sector must quickly increase investment into AMR research, or the cost to national economies and public health could be devastating. mRNA technologies will represent a rapid and valuable platform to be further exploited for vaccines against AMR infections.

Digital health platforms will also become more integrated, seeking efficiency through harmonized data generation. The use of Self-Sovereign Identities within healthcare solutions will expand. For these breakthroughs to happen, regulatory authorities must catch up with the pace of research and innovation in health systems in 2022 by updating legal frameworks.

Imran Haq, Emerging Science and Technology theme lead, Pistoia Alliance: Driven by macro geopolitical trends and Big Tech, emerging technologies are being developed increasingly rapidly. Reflecting this, deal-making in the quantum space will continue to grow apace in 2022. As the buzz around the sector increases, will this be the year we finally start to see that buzz translate into early versions of applications and use cases in the pharma industry? A likely quantum use case could be improving supply-chain efficiency. Big promises were made during COP26, and large organizations, including pharma companies, must have net-zero strategies. This is also an area we would like to explore with the Pistoia Alliance's Quantum Computing Community of Interest.

Pharma is also going to play an increasingly critical role in space exploration. As plans to launch commercial space stations from companies like Blue Origin accelerate, pharma should be engaged to ensure humans stay healthy and can survive long term in extreme environments. 2022 is the time to think about how we could be molding and driving forward health in space.

Anca Ciobanu, Improving the Efficiency and Effectiveness of R&D theme lead, Pistoia Alliance: Efficiency in R&D is on an exponential growth path as more pharma and biotech organizations partner with AI and robotics companies, enabling a more automated drug discovery process. In 2022, the major tech players will increase their focus on the life sciences and will play an important role in developing new products and initiatives. The application of new technologies will not only empower scientists to conduct experiments more efficiently, but will also help them make more breakthrough discoveries. As companies continue to invest resources in launching or improving their autonomous labs, researchers will need upskilling in data science to be able to program and interact with the machines.

The Pistoia Alliance has more than 150 member companies including major life science companies, technology and service providers, academic groups, publishers, and patient research groups. Members collaborate as equals on projects that generate value for the worldwide life sciences and healthcare ecosystem. To find out more about the Alliance and its portfolio of projects, click here: https://www.pistoiaalliance.org/category/projects/.

--ENDS

About The Pistoia Alliance:

The Pistoia Alliance is a global, not-for-profit members' organization made up of life science companies, technology and service providers, publishers, and academic groups working to lower barriers to innovation in life science and healthcare R&D. It was conceived in 2007 and incorporated in 2009 by representatives of AstraZeneca, GSK, Novartis and Pfizer who met at a conference in Pistoia, Italy. Its projects transform R&D through pre-competitive collaboration. It overcomes common R&D obstacles by identifying the root causes, developing standards and best practices, sharing pre-competitive data and knowledge, and implementing technology pilots. There are currently over 150 member companies; members collaborate on projects that generate significant value for the worldwide life sciences R&D community, using The Pistoia Alliance's proven framework for open innovation.

Media Contacts:

Spark Communications

+44 207 436 0420

pistoiaalliance@sparkcomms.co.uk

Tanya Randall

The Pistoia Alliance

+44 7887 811332

tanya.randall@pistoiaalliance.org

Read this article:
Pistoia Alliance predicts a focus on the fight against antimicrobial resistance and a surge in quantum computing research for 2022 - Bio-IT World

Why Blockchain isn't as secure as you think – Evening Standard

Blockchain has rapidly become one of the most disruptive technologies of the 21st century, but with the continuous improvements in quantum computing, the foundations of the technology are starting to falter.

Blockchain, cryptocurrencies, NFTs and decentralised finance have become common terms, with blockchain now hailed as an extremely secure and much faster method of recording transactions because of the computational intensity of attempting to break it. Companies and individuals alike have poured vast amounts of capital into the technology by buying cryptocurrencies or by developing their own currency or asset chains.

But in a dynamic cyber environment, is this $2.7 trillion market really future-proof and secure?

With every innovation in quantum computing, the threat to blockchain increases.

The technology faces two main issues: first, its reliance on a form of encryption known as public-key cryptography; and second, its reliance on a type of algorithm called a hash function.

Public-key cryptography is a method of encryption in which a key is published for anyone to use, so that they can encrypt information that only the holder of the corresponding private key can read.
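As a toy illustration of that public/private split, the sketch below implements textbook RSA with tiny primes in plain Python. The numbers are the classic classroom example and are chosen purely for illustration; production systems use vetted libraries, padding schemes and far larger keys.

    # Textbook RSA with toy parameters (illustrative only).
    p, q = 61, 53                  # tiny primes, fine for a demonstration
    n = p * q                      # public modulus (3233)
    phi = (p - 1) * (q - 1)        # Euler's totient of n
    e = 17                         # public exponent, coprime to phi
    d = pow(e, -1, phi)            # private exponent: modular inverse (Python 3.8+)

    message = 42                   # a message encoded as an integer < n
    ciphertext = pow(message, e, n)    # anyone can encrypt with the public key (e, n)
    recovered = pow(ciphertext, d, n)  # only the private-key holder can decrypt

    assert recovered == message
    print(f"public key ({e}, {n}) -> ciphertext {ciphertext} -> recovered {recovered}")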

A hash is generated by running a widely known and well-established algorithm over a piece of information to create a near-unique digital representation of it. It is computationally infeasible to reconstruct the original information from its hashed representation (preimage resistance), and it should be equally infeasible to find another piece of data with the exact same digital representation (collision resistance). In both proof-of-work and proof-of-stake blockchains, digitally signed hashes are used in combination with random numbers to sign off a block.
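A minimal, self-contained demonstration of those properties using Python's standard hashlib: a one-character change in the input yields an unrelated SHA-256 digest, and reversing a digest amounts to a brute-force search.

    import hashlib

    # Hashing is deterministic, but a one-character change in the input
    # produces an unrelated digest (the avalanche effect).
    a = hashlib.sha256(b"block 1024: Alice pays Bob 5 BTC").hexdigest()
    b = hashlib.sha256(b"block 1024: Alice pays Bob 6 BTC").hexdigest()
    print(a)  # one 64-hex-character digest
    print(b)  # a completely different digest

    # Recovering an input from a digest has no known shortcut: it requires
    # searching the input space, which is what makes hashes one-way.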

The threat from quantum computing to public-key encryption is a known issue and has been discussed at length by many experienced professionals. It is an issue that both governments and commercial entities have recognised. NIST, the US National Institute of Standards and Technology, is currently in the process of defining what the next phase of encryption (also known as post-quantum cryptography) will be. Many experts will point out that the types of quantum computers capable of cracking this are still far away, which is true, but various competing technologies alongside quantum are pushing this to the forefront of the cybersecurity threat landscape.

The main near-term issue facing the chain therefore comes from the threat that quantum computing, or quantum-accelerated hardware, poses to the hashing algorithm. There are several issues with the hash method, but the central one is that a quantum computer will be able to solve for these hashes at a much faster rate than any classical approach, thereby taking ownership of a network. Significant progress has been made in the past two years on a type of quantum algorithm called Grover's algorithm, which poses the greatest risk to the network because a fully error-corrected quantum computer is not needed.
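To see why that speedup matters, the hedged sketch below runs a miniature proof-of-work search classically and then prints the rough query count Grover's algorithm would need; the sqrt(N) figure is the standard textbook estimate, not a quantum simulation, and the puzzle parameters are made up for illustration.

    import hashlib
    from itertools import count
    from math import isqrt

    # Classical brute force: find a nonce whose SHA-256 digest starts with
    # `difficulty` zero hex characters (a miniature proof-of-work puzzle).
    def mine(prefix: bytes, difficulty: int = 4) -> int:
        for nonce in count():
            digest = hashlib.sha256(prefix + str(nonce).encode()).hexdigest()
            if digest.startswith("0" * difficulty):
                return nonce

    trials = mine(b"block-header:") + 1
    print(f"classical search: ~{trials:,} hash evaluations")
    # Grover's algorithm would need on the order of sqrt(N) quantum queries:
    print(f"Grover estimate:  ~{isqrt(trials):,} queries")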

"Evaluating and understanding the risk only gets us part way," says David Worrall, co-founder of Secqai. "It is now time to implement the solutions available to prepare us for the future."

This risk is further accentuated by the decentralised nature of blockchain, where the latest cyber technology hasn't been built to integrate easily with, for example, new hardware-based cryptography such as secure entropy sources or quantum random number generators.

Indeed, research has shown that deploying post-quantum-safe algorithms in today's blockchain architectures is not possible without a huge increase in transaction costs, sometimes outweighing the value of the transaction itself.

Conversely, traditional banking infrastructure is relatively easy to update, as the back-end software and hardware are managed centrally by each bank and each integrated party; in other words, the list of parties that need to be secured is well known.

Blockchain developers understand the challenge today and, as has been shown, need to start preparing their systems by integrating post-quantum methods into their infrastructure and adopting best-practice techniques to ensure they are ready for a quantum world.

Rahul Tyagi is an ex-management consultant, inventor and co-founder of cyber security start-up Secqai

Original post:
Why Blockchain isn't as secure as you think - Evening Standard

IBM creates largest ever superconducting quantum computer – New Scientist

IBM has made a 127-qubit quantum computer. This is over double the size of comparable machines made by Google and the University of Science and Technology of China

By Matthew Sparkes

IBM claims it has created the world's largest superconducting quantum computer, surpassing the size of state-of-the-art machines from Google and from researchers at a Chinese university. Previous devices have demonstrated up to 60 superconducting qubits, or quantum bits, working together to solve problems, but IBM's new Eagle processor more than doubles that by stringing together 127.

Several approaches are being pursued by teams around the world to create a practical quantum computer, including superconductors and entangled photons, and it remains unclear which will become the equivalent of the transistor that powered the classical computing revolution.

In 2019, Google announced that its Sycamore processor, which uses the same superconducting architecture that IBM is working with, had achieved quantum supremacy, the name given to the point at which quantum computers can solve a problem that a classical computer would find impossible. That processor used 54 qubits, but it has since been surpassed by a 56-qubit and then a 60-qubit demonstration with the Zuchongzhi superconducting processor from the University of Science and Technology of China (USTC) in Hefei.

IBM's 127-qubit Eagle processor now takes the top spot as the largest, and therefore theoretically most powerful, superconducting quantum computer to be demonstrated. Each additional qubit represents a significant step forward in ability: unlike classical computers, which rise in power in a linear fashion as they grow, one additional qubit effectively doubles a quantum processor's potential power.
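A short calculation makes that doubling concrete: simulating an n-qubit state on a classical machine means storing 2**n complex amplitudes, so memory demands explode with qubit count. The 16-bytes-per-amplitude figure below is an illustrative assumption (two 64-bit floats per complex number).

    # Classical memory needed to store the full state vector of n qubits,
    # assuming 16 bytes (two 64-bit floats) per complex amplitude.
    for n in (27, 53, 60, 127):
        bytes_needed = (2 ** n) * 16
        print(f"{n:>3} qubits: {bytes_needed:.3e} bytes")
    # 27 qubits fit in a laptop's RAM; 127 qubits would need ~2.7e39 bytes,
    # vastly more than all the storage on Earth.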

Canadian company D-Wave Systems has sold machines consisting of thousands of qubits for some years, but they are widely considered to be specialised machines tailored to a particular algorithm called quantum annealing, rather than fully programmable quantum computers. In recent years, much progress in quantum computing has focused on superconducting qubits, one of the main technologies that Google, USTC and IBM are backing.

Bob Sutor at IBM says that breaking the 100-qubit barrier is more psychological than physical, but that it shows the technology can grow. "With Eagle, we're demonstrating that we can scale, that we can start to generate enough qubits to get on a path to have enough computation capacity to do the interesting problems. It's a stepping stone to bigger machines," he says.

However, it is difficult to compare the power of the IBM chip with previous processors. Both Google and USTC used a common test to assess such chips, which was to simulate a quantum circuit and sample random numbers from its output. IBM claims to have created a more programmable and adaptable processor, but has yet to publish an academic paper setting out its performance or abilities.

Peter Leek at the University of Oxford says it is tempting to assess performance entirely on the qubit count, but that there are other metrics that need to be looked at, none of which has yet been released for Eagle. "It's definitely positive, it's good that they're making something with more qubits, but ultimately it only becomes useful when the processor performs really well," he says.

Scott Aaronson at the University of Texas at Austin has similar reservations about judging the importance of the new processor at this stage, saying that more detail is needed. "I hope that information will be forthcoming," he says.

IBM has said that it hopes to demonstrate a 400-qubit processor next year and to break the 1000-qubit barrier the following year with a chip called Condor. At that point, it is expected that a limit on expansion will be reached that requires quantum computers to be created from networks of these processors strung together by fibre-optic links.


Read more:
IBM creates largest ever superconducting quantum computer - New Scientist