Archive for the ‘Quantum Computing’ Category

The future of business: learning from the past – ITProPortal

By its very definition, progress across humanity, society, and business is about evolution. Developments and inventions are rarely unique; they are more often than not an evolution of things that already exist. As French writer Simone de Beauvoir aptly put it, to make something good of the future, you have to look the present in the face. In a business context, the evolution of both historical tools and recent trends will shape the future of how we work.

Though the future of work will always be in the future, the future of your work has never been closer. The rise of robots, machine intelligence, distributed ledgers, quantum physics, gig labour, the unexaggerated death of privacy, a world eaten alive by software: all of these trends point to a new world that is shaping up quite differently from anything we have ever seen, or worked in, before.

A recent Cognizant report looked at milestone inventions over the past centuries to see how they can help to inform, and transform, future technological developments. Here we explore how the apps, systems, tools, and processes of the past and present will define the future of business.

John Leonard Riddell invented the first practical binocular microscope in 1851, changing the course of medicine forever by allowing doctors to diagnose problems at a cellular level. The medicinal microscope simultaneously made the world a better place and created an entire industry that today employs millions of people around the world.

Just as microscopes changed the course of medicine more than a century ago, artificial intelligence (AI) will function as a datascope for businesses to see more data, integrate it with other data, and ultimately, make faster decisions. New tools do not necessarily automate people out of the equation completely; they allow professionals to do things they were not previously capable of.

The future world of work will see people and technology work symbiotically, with AI allowing us to grapple with a world awash with information that is denser, more complex, and coming at us faster than ever before. In turn, AI will open new opportunities for commercial growth and levels of employment for billions, making the world an even better place.

Cloud computing is the lifeblood of both our personal and professional lives, with nearly every transaction and interaction taking place via some form of private, public, or hybrid cloud. The cloud has supercharged distributed computing, that is, a system where individual computers across different locations are networked together and information is shared by passing messages between the processors. Google's search engine is an example of distributed computing, as are cellular networks and intranets. But with more internet-connected devices (VR headsets, health trackers, toothbrushes) coming online and 5G accelerating everything, we will need more computing power.

Edge computing is the answer to this problem. A framework where data is processed as close as possible to its originating source (the edge of the network) rather than in centralised systems, edge computing will enable a new era of business.

In the not-too-distant future, geodistributed machine learning (GDML), or AI on the edge, will allow organisations to meet governance challenges posed by data that is born in geographically distributed places or used in dispersed locations. With reduced latency and real-time responsiveness, we will see technologies such as augmented reality truly shape the enterprise realm and play a significant role in how work is performed.

The Z1, the world's first electromechanical binary programmable computer, was created by German scientist Konrad Zuse in his parents' living room in 1938. This humble moment kicked off the greatest technological revolution in history. Virtually everything we do in life and business is influenced by binary computing power, from the systems that run our cars to those that power modern businesses. However, these computers still operate according to one of the simplest concepts: a series of ones and zeros.

Where a bit can only be either one or zero, a qubit can be both one and zero at exactly the same time. The future of business (AI, machine learning, and predictive modelling) will be powered by the qubit via quantum computing. And this future is in sight, with companies such as IBM, D-Wave, and Alphabet all working to develop usable quantum computers.

The future of work and business is an elusive concept that either excites or terrifies, largely due to the unknown nature of it. However, it is not so unknown, as the clues to the future actually lie in our past. In a world that will be awash with unfathomable amounts of data, we will need new tools, like those that transformed our world in the past, to realise the immense opportunity that is right in front of us.

Euan Davis, European Lead, Centre for the Future of Work, Cognizant


Atomic-Scale Resolution in Space and Time – Optics & Photonics News

By combining laser pulses (red) with scanning tunneling microscopy, researchers can achieve a temporal resolution of several hundred attoseconds when imaging quantum processes such as an electronic wave packet (colored wave) with atomic spatial resolution. [Image: Christian Hackenberger]

An elusive challenge in the field of microscopy has been to achieve concurrent atomic-scale resolution in both space and time. With a groundbreaking proof-of-concept study, researchers in Germany have demonstrated a new technique that attains this goal by combining scanning tunneling microscopy with ultrashort laser pulses (Science, doi: 10.1126/science.aaz1098).

Such an ultrafast, high-definition microscope has wide-ranging applications in areas like nanoelectronics, lightwave electronics, chemistry and quantum computing.

Current imaging techniques that are able to resolve objects at the atomic level (for instance, scanning tunneling microscopy) lack the temporal resolution necessary to track electron movement. Ultrafast laser spectroscopy can measure electron dynamics at natural time scales but misses the mark in terms of spatial definition.

For his doctoral work, Manish Garg of the Max Planck Institute for Solid State Research in Germany studied femtosecond physics and found himself becoming frustrated with the low spatial resolution of the technique. As a result, he began working with electron pulses but hit a wall when trying to compress them to shorter time scales, since electrons repel each other even in a vacuum.

"There are a lot of techniques where you have one resolution, temporal or spatial, and you push to get the other," said Garg. "This has been quite a challenge and a big bottleneck in the field."

Garg and his colleague Klaus Kern, an expert in scanning tunneling microscopy, overcame this obstacle by integrating a phase-locked train of ultrafast laser pulses with a scanning tunneling microscope. For the first time, a technique was able to simultaneously probe electron dynamics in the sub-angstrom and sub-femtosecond regimes, which are the natural length and time scales of atoms.

The researchers focused optical pulses with a time duration of less than 6 femtoseconds onto the apex of a nanotip in a scanning tunneling microscope junction. The pulses are precisely tuned with the same carrier-envelope phase, which creates a high electric field that lowers the tunneling barrier. Electrons tunnel between the nanotip and the sample, and by measuring changes in the intensity of this induced tunneling current, the researchers can distinguish atomic-level dynamics in the surface at high speeds.

Garg and Kern demonstrated the power of their device by studying the carrier decay dynamics of collective oscillations of electrons in a gold nanorod.

This type of optical field-driven tunneling microscopy enables topography mapping of surfaces at the same spatial resolution as a conventional scanning tunneling microscope. The added capability to control electron tunneling at small time scales essentially transforms the microscope into a high-speed camera for the quantum world.

"All electronics are shrinking to smaller dimensions," said Garg, "and if you want to understand the electron dynamics happening in the small dimensions of a circuit, you should be able to do it with our technique."


How Quantum Computers Work

A quantum computer is a computer design that uses the principles of quantum physics to increase computational power beyond what is attainable by a traditional computer. Quantum computers have been built on a small scale, and work continues to upgrade them to more practical models.

Computers function by storing data in a binary number format, which results in a series of 1s and 0s retained in electronic components such as transistors. Each component of computer memory is called a bit and can be manipulated through the steps of Boolean logic so that the bits change, based upon the algorithms applied by the computer program, between the 1 and 0 modes (sometimes referred to as "on" and "off").

A quantum computer, on the other hand, would store information as either a 1, 0, or a quantum superposition of the two states. Such a "quantum bit" allows for far greater flexibility than the binary system.
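To make the contrast concrete, here is a minimal sketch in Python with numpy, illustrating the state-vector picture rather than any particular machine's API: a classical bit is always exactly 0 or 1, while a qubit is a normalized pair of complex amplitudes, and a Hadamard gate places it in an equal superposition of both.

```python
import numpy as np

# Basis states |0> and |1> as two-component complex vectors.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# A Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

qubit = H @ zero
print(qubit)               # [0.707+0.j, 0.707+0.j]
print(np.abs(qubit) ** 2)  # [0.5, 0.5]: equal measurement probabilities
```

Squaring the amplitudes gives the probabilities of reading 0 or 1; the superposition itself is what the binary system has no counterpart for.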

Specifically, a quantum computer would be able to perform calculations on a far greater order of magnitude than traditional computers ... a capability with serious implications for the realm of cryptography and encryption. Some fear that a successful and practical quantum computer would devastate the world's financial system by ripping through its computer security encryption, which is based on factoring large numbers that literally cannot be cracked by traditional computers within the lifespan of the universe. A quantum computer, on the other hand, could factor the numbers in a reasonable period of time.

To understand how this speeds things up, consider an example. If a qubit is in a superposition of the 1 state and the 0 state, and it performs a calculation with another qubit in the same superposition, then one calculation actually obtains 4 results: a 1/1 result, a 1/0 result, a 0/1 result, and a 0/0 result. This is a result of the mathematics applied to a quantum system while it remains coherent, in a superposition of states, until it collapses down into one state. The ability of a quantum computer to perform multiple computations simultaneously (or in parallel, in computer terms) is called quantum parallelism.
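In the state-vector picture, that parallelism is just a tensor product: two qubits in superposition form one four-amplitude register. A minimal numpy sketch (illustrative, not any vendor's API):

```python
import numpy as np

# A single qubit in equal superposition: (|0> + |1>) / sqrt(2).
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)

# The joint state of two such qubits is their tensor product: one
# register simultaneously carrying amplitudes for |00>, |01>, |10>,
# and |11> -- the four results (0/0, 0/1, 1/0, 1/1) named above.
register = np.kron(plus, plus)

for k, amp in enumerate(register):
    print(f"|{k:02b}>  amplitude {amp.real:.3f}  probability {abs(amp) ** 2:.2f}")
```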

The exact physical mechanism at work within the quantum computer is somewhat theoretically complex and intuitively disturbing. Generally, it is explained in terms of the many-worlds interpretation of quantum physics, wherein the computer performs calculations not only in our universe but also in other universes simultaneously, while the various qubits are in a state of quantum coherence. While this sounds far-fetched, the many-worlds interpretation has been shown to make predictions which match experimental results.

Quantum computing tends to trace its roots back to a 1959 speech by Richard P. Feynman in which he spoke about the effects of miniaturization, including the idea of exploiting quantum effects to create more powerful computers. This speech is also generally considered the starting point of nanotechnology.

Of course, before the quantum effects of computing could be realized, scientists and engineers had to more fully develop the technology of traditional computers. This is why, for many years, there was little direct progress, or even interest, in the idea of making Feynman's suggestions into reality.

In 1985, the idea of "quantum logic gates" was put forth by the University of Oxford's David Deutsch, as a means of harnessing the quantum realm inside a computer. In fact, Deutsch's paper on the subject showed that any physical process could be modeled by a quantum computer.

Nearly a decade later, in 1994, AT&T's Peter Shor devised an algorithm that could use only 6 qubits to perform some basic factorizations ... with more qubits needed as the numbers requiring factorization grew more complex, of course.
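For context, Shor's algorithm reduces factoring to order finding: find the smallest r with a^r ≡ 1 (mod N), and the order reveals the factors. The quantum machinery only accelerates the order search; the sketch below, which factors 15, performs that search by brute force instead, purely to illustrate the reduction.

```python
from math import gcd

def order(a, n):
    """Smallest r > 0 with a**r % n == 1. This brute-force search is the
    step Shor's algorithm speeds up with a quantum Fourier transform."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_reduction(n, a):
    """Classical half of Shor's algorithm: turn the order of a mod n
    into a pair of factors. An unlucky base a can yield an odd order
    or trivial factors, in which case one retries with another a."""
    r = order(a, n)
    if r % 2:
        raise ValueError("odd order; retry with a different base a")
    half = pow(a, r // 2)
    return gcd(half - 1, n), gcd(half + 1, n)

print(shor_reduction(15, 7))  # -> (3, 5)
```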

A handful of quantum computers have been built. The first, a 2-qubit quantum computer in 1998, could perform trivial calculations before losing coherence after a few nanoseconds. In 2000, teams successfully built both a 4-qubit and a 7-qubit quantum computer. Research on the subject is still very active, although some physicists and engineers express concerns over the difficulties involved in upscaling these experiments to full-scale computing systems. Still, the success of these initial steps does show that the fundamental theory is sound.

The quantum computer's main drawback springs from the same property as its strength: the fragility of quantum coherence. The qubit calculations are performed while the quantum wave function is in a state of superposition between states, which is what allows it to perform the calculations using both 1 and 0 states simultaneously.

However, when a measurement of any type is made on a quantum system, the coherence breaks down and the wave function collapses into a single state. Therefore, the computer has to somehow continue making these calculations without any measurements being made until the proper time, when it can drop out of the quantum state, have a measurement taken to read its result, and pass that result on to the rest of the system.
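A toy numpy model of that constraint: the amplitudes evolve undisturbed until a measurement, at which point the Born rule picks one outcome and the register collapses onto it. The function here is illustrative, not from any quantum library.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Two qubits in equal superposition: four amplitudes of magnitude 1/2.
state = np.full(4, 0.5, dtype=complex)

def measure(state):
    """Born rule: outcome k occurs with probability |amplitude_k|**2;
    the wave function then collapses onto that single basis state."""
    probs = np.abs(state) ** 2
    k = rng.choice(len(state), p=probs)
    collapsed = np.zeros_like(state)
    collapsed[k] = 1.0
    return k, collapsed

outcome, state = measure(state)
print(f"measured |{outcome:02b}>, state collapsed to {state.real}")
```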

The physical requirements of manipulating a system on this scale are considerable, touching on the realms of superconductors, nanotechnology, and quantum electronics, as well as others. Each of these is itself a sophisticated field which is still being fully developed, so trying to merge them all together into a functional quantum computer is a task which I don't particularly envy anyone ... except for the person who finally succeeds.


Healthcare venture investment in 2020: Quantum computing gets a closer look – Healthcare IT News

Among the healthcare technologies venture firms will be looking at most closely in 2020, various artificial intelligence and machine learning applications are atop the list, of course. But so are more nuts-and-bolts tools like administrative process automation and patient engagement platforms, VCs say.

Other, more leading-edge technologies (genomics-focused data and analytics, and even quantum computing) are among the areas attracting investor interest this year.

"We expect 2020 to mark the first year where health IT venture firms will start to look at quantum computing technology for upcoming solutions," Dr. Anis Uzzaman, CEO and general partner of Pegasus Tech Ventures, told Healthcare IT News.

"With the breakthrough supremacy announcement from Google validating the technology and the subsequent launch of the service Amazon Braket in 2019, there is sure to be a new wave of entrepreneurial activity starting in 2020."

He said quantum computing technology holds a lot of promise for the healthcare industry, with potential breakthroughs possible throughout the health IT stack, from operations and administration to security.

Among the promising companies, Uzzaman pointed to Palo Alto-based QC Ware, a startup pioneering a software solution that enables companies to use a variety of quantum hardware platforms, such as Rigetti and IBM, to solve enterprise problems, including those specifically related to healthcare.

He also predicted artificial intelligence would continue to be at the forefront for health IT venture firms in 2020 as it becomes more clear which startups may be winners in their initial target sectors.

"There has been consistent growth of investment activity over the past few years into healthcare startups using artificial intelligence to target a range of areas from imaging to diagnostics," he said.

However, Uzzaman also noted regulation and long enterprise sales cycles have largely slowed the ability for these companies to significantly scale their revenues.

"Therefore, we anticipate 2020 will be the year where it will become clearer to health IT venture firms who will be winners in applying artificial intelligence to imaging, pathology, genomics, operations, diagnostics, transcription, and more," he said. "We will also continue to see moderate growth in the overall investment amount in machine learning and AI companies, but will see a notable decrease in the number of companies receiving an investment.

Uzzaman explained there were already some signs in late 2019 that we could be late in a short-term innovation cycle for artificial intelligence, with many companies, particularly those applying machine learning and AI to robotics, shutting down.

"However, we anticipate many companies will reach greater scale with their solutions and separate themselves from the competition, which will translate into more mega funding rounds," he said.

Ezra Mehlman, managing partner with Health Enterprise Partners, explained that at the beginning of each year, the firm conducts a market mapping exercise to determine which healthcare IT categories are rising to the top of the prioritization queue of its network of hospital and health plan limited partners.

"In the past year, we have seen budgets meaningfully open for automation solutions in administrative processing, genomics-focused data and analytics offerings, aging-in-place technologies and, in particular, patient engagement platforms rooted in proven clinical use cases," he said. "We are actively looking at all of these spaces."

He pointed out that in 2018, more than $2 billion was invested into artificial intelligence and machine learning healthcare IT companies, which represented a quarter of the total dollars invested into digital health companies that year.

"We view this as a recognition of two things: the meteoric aspirations that the market has assigned to AI and machine learning's potential, and a general sense that the underlying healthcare data infrastructure has reached the point of maturity, where it is possible to realize ROI from AI/machine learning initiatives," he said.

However, he said Health Enterprise Partners is still waiting for the "breakout" to occur in adoption.

"We believe we have now reached the point where category leaders will emerge in each major healthcare AI subsector and the usage will become more widespread we have made one such investment in the clinical AI space in the last year," Mehlman said.

Heading into 2020, Mehlman said companies that cannot deliver high-six-figure, year-one ROI in the form of increased revenue or reduced cost will struggle, and companies that cannot crisply answer the question, "Who is the buyer and what is the budget?" will be challenged.

"If one applies these tests to some of the areas that have attracted the most healthcare VC investment--social determinants of health, blockchain and digital therapeutics to name a few the number of viable companies sharply drops off," he said.

Mehlman noted that while these sound like simple principles, the current environment of rapidly consolidating, budget-constrained hospitals, vertically integrating health plans, and big tech companies making inroads into healthcare has raised the bar on what is required for a healthcare startup to gain meaningful market traction.


ASC20 Finals to be Held in Shenzhen, Tasks Include Quantum Computing Simulation and AI Language Exam – HPCwire

BEIJING, Jan. 21, 2020 – The 2020 ASC Student Supercomputer Challenge (ASC20) announced the tasks for the new season: using supercomputers to simulate quantum circuits and training AI models to take an English test. These tasks pose unprecedented challenges for the 300+ ASC teams from around the world. From April 25 to 29, 2020, the top 20 finalists will compete fiercely at SUSTech in Shenzhen, China.

ASC20 set up quantum computing tasks for the first time. Teams will use QuEST (the Quantum Exact Simulation Toolkit) running on supercomputers to simulate 30 qubits in two cases: quantum random circuits (random.c) and quantum fast Fourier transform circuits (GHZ_QFT.c). Quantum computing is a disruptive technology, considered to be the next generation of high performance computing. However, the R&D of quantum computers is lagging behind due to the unique properties of quantum systems, which makes it difficult for scientists to use real quantum computers to solve some of the most pressing problems, such as particle physics modeling, cryptography, genetic engineering, and quantum machine learning. From this perspective, the quantum computing task presented in the ASC20 challenge will hopefully inspire new algorithms and architectures in this field.
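For a sense of what the teams' code must do, here is a minimal state-vector simulator sketch in Python with numpy; it builds a three-qubit GHZ state rather than the contest's 30-qubit GHZ_QFT.c case, and it illustrates the idea only, not QuEST's actual C API.

```python
import numpy as np

I = np.eye(2)
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

def kron(*ops):
    """Tensor several gates into one operator on the full register."""
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

n = 3
state = np.zeros(2 ** n)
state[0] = 1.0                  # start in |000>

state = kron(H, I, I) @ state   # Hadamard on qubit 0
state = kron(CNOT, I) @ state   # CNOT: qubit 0 controls qubit 1
state = kron(I, CNOT) @ state   # CNOT: qubit 1 controls qubit 2

# GHZ state: equal amplitudes on |000> and |111> only.
print(np.round(state, 3))
```

The exponential cost is the whole point: a 30-qubit state vector holds 2^30 complex amplitudes, about 16 GiB in double precision before counting a simulator's working copies, which is why the task calls for supercomputers.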

The other task revealed is the Language Exam Challenge. Teams will take on the challenge of training AI models on an English cloze test dataset, vying to achieve the highest test scores. The dataset covers multiple levels of English language tests in China, including the college entrance examination, College English Test Band 4 and Band 6, and others. Teaching machines to understand human language is one of the most elusive and long-standing challenges in the field of AI. The ASC20 AI task embodies this challenge, using human-oriented problems to evaluate the performance of neural networks.
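A common baseline for such cloze questions is to let a masked language model score each candidate answer and submit the most probable one. A hedged sketch using the Hugging Face transformers fill-mask pipeline follows; the sentence, options, and model choice are illustrative assumptions, not details of the ASC20 dataset.

```python
# Sketch of cloze-test answering with a masked language model. The
# question and options below are invented examples, not ASC20 data.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

question = f"She was so tired that she went straight to {fill.tokenizer.mask_token}."
options = ["bed", "school", "work", "dance"]

# Score only the candidate answers and submit the most probable one.
predictions = fill(question, targets=options)
best = max(predictions, key=lambda p: p["score"])
print(best["token_str"], round(best["score"], 4))
```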

Wang Endong, ASC Challenge initiator, member of the Chinese Academy of Engineering and Chief Scientist at Inspur Group, said that through these tasks, students from all over the world get to access and learn the most cutting-edge computing technologies. ASC strives to foster supercomputing and AI talent with global vision, inspiring technical innovation.

Dr. Lu Chun, Vice President of SUSTech, host of the ASC20 Finals, commented that supercomputers are important infrastructure for scientific innovation and economic development. SUSTech is making focused efforts to develop supercomputing, and by hosting ASC20 it hopes to drive the training of supercomputing talent, international exchange and cooperation, and interdisciplinary development at SUSTech.

Furthermore, during January 15-16, 2020, the ASC20 organizing committee held a competition training camp in Beijing to help student teams prepare for the ongoing competition. HPC and AI experts from the State Key Laboratory of High-end Server and Storage Technology, Inspur, Intel, NVIDIA, Mellanox, Peng Cheng Laboratory and the Institute of Acoustics of the Chinese Academy of Sciences gathered to provide on-site coaching and guidance. Previous ASC winning teams also shared their successful experiences.

About ASC

The ASC Student Supercomputer Challenge is the world's largest student supercomputer competition, sponsored and organized by the Asia Supercomputer Community in China and supported by Asian, European, and American experts and institutions. The main objectives of ASC are to encourage exchange and training of young supercomputing talent from different countries, improve supercomputing applications and R&D capacity, boost the development of supercomputing, and promote technical and industrial innovation. The annual ASC Supercomputer Challenge was first held in 2012 and has since attracted over 8,500 undergraduates from all over the world. Learn more about ASC at https://www.asc-events.org/.

Source: ASC
