Archive for the ‘Quantum Computing’ Category

Panelists explore 'Science of the Very, Very Small' – Cornell Chronicle

From a nanoscale brobot flexing its muscles to a discussion of the artistry of scientific images, participants at a March 9 event got an up-close look at how quantum science and nanotechnology are shaping our lives.

Arts Unplugged: Science of the Very, Very Small included both online and in-person activities, centered around 11 TED-style talks given by faculty members in the College of Arts and Sciences. The faculty shared their research and thoughts on topics from gene manipulation and miniature robots to ethical considerations of nanotech and the interplay between science and fiction through an online eCornell presentation, which was also livestreamed to audiences in the Groos Family Atrium in Klarman Hall and the Clark Atrium in the Physical Sciences Building.

Members of the Cornell community attempt some origami during the event.

"I'm a particle physicist, and particle physicists like to take things apart until we find their smallest constituents, pulling them apart until there's nothing indivisible anymore," said Peter Wittich, professor of physics (A&S) and director of the Laboratory of Elementary Particle Physics, echoing the Arts Unplugged theme.

The event brought together scientists and humanists from numerous fields, including physics, chemistry, biology, literature and moral psychology. Natalie Wolchover, senior science editor and writer at Quanta Magazine and the Zubrow Distinguished Visiting Journalist in A&S, served as moderator, asking questions of each of the panelists after their presentations.

"We tend to think of ourselves as small in the grand scheme of things, and understandably so: The universe is huge," Wolchover said. "You'd have to line up 500 trillion trillion humans head to toe to stretch across the observable universe. And yet, somehow it's still easier to conceive of how much bigger space is compared to us than it is to imagine how small the smallest things in the universe are relative to us."

John Marohn, professor of chemistry and chemical biology (A&S), is building a microscope that can image things smaller than a nanometer, the size of an individual water molecule.

"We're using this to image spins in quantum materials to study quantum computing, but we're also using this to image the molecules of life," he said.

Roald Hoffmann, the Frank H. T. Rhodes Professor Emeritus in the Department of Chemistry and Chemical Biology (A&S), spoke of the beauty of scientific images, showing an image of the nanoworld that he compared to a chocolate wafer and another that looked like sand dunes.

"These are images that convey information, but they also have some significance just as images," Hoffmann said. "The scientists are making artistic choices (they don't think they are, but they are) in the process of showing them."

Ailong Ke, professor of molecular biology and genetics (A&S), talked about how CRISPR technology can be used to combat disease.

"In the next phase of our study, we really hope to bring [about CRISPR's] therapeutic power," Ke said, which could be used to delete viruses from our genome or halt the growth of cancer cells.

Directly after Ke's talk, Julia Markovits, associate professor of philosophy (A&S), discussed the slippery slope argument that is often used to justify prohibiting a new technology like gene editing. If we use CRISPR to cure sickle cell anemia, for example, applying the slippery slope means that designer babies would inevitably follow. But such reasoning is faulty, said Markovits: a better metaphor would be a string of dominoes in which it's difficult to topple the entire string.

Eun-Ah Kim, professor of physics (A&S), talked about her work studying the social phenomena of electrons. "We are a lucky generation because we can see these electron spins implemented into a set of qubits that can be individually controlled, to be programmed for computation, in quantum computers," she said. "In my research, I try to bridge this nascent technology with more established classical computing."

Other A&S faculty presenting included:

During the program's intermission, Michael Reynolds, M.S. '17, Ph.D. '21, postdoctoral associate in the Smith School of Chemical and Biomolecular Engineering in the College of Engineering, demonstrated an origami model of a nanobot for viewers to try.

About 30 students from the Milstein Program in Technology and Humanity attended the livestream in Clark Atrium and attempted the origami duck, with some in-person help from Reynolds. Those attending the livestream in Klarman Hall were assisted by Qingkun Liu, postdoctoral researcher in physics. Origami design principles are used by researchers, including Reynolds and Liu, working with Itai Cohen, professor of physics (A&S), and Paul McEuen, the John A. Newman Professor of Physical Science (A&S), as they create tiny programmable robots, including the brobot.

The intermission also included the announcement of the winners of the College's Envisioning the Future contest: Lucca Schwartz in the elementary category; Avalon Golden and Sophia Schumaecker in the high school category; and Vinh Truong in the adult category. Their winning submissions can be found online on the Arts Unplugged website.

The recording of the event is available to watch for free on eCornell.


Developing an AI-ready business era is the need of the hour – Times of India

Human beings increasingly desire to live in a world of perfection, which has become attainable thanks to advancements in technology. Recent technical advances have brought us to the point of near-complete automation. Artificial Intelligence (AI) is one such breakthrough, a godsend for businesses of all types looking to innovate in this cut-throat competitive period. AI appears to be on the verge of a breakthrough of its own, having turned a crisis into an opportunity for businesses across sectors and fundamentally altered the dynamics of the commercial world.

Many polls conducted by specialists across disciplines have found that AI is adding feathers to the caps of various expanding businesses, and as a result, organizations have become more open to incorporating AI technology into their working paradigm. However, when it comes to defining AI, there are still many unanswered questions. A brief definition of Artificial Intelligence is an intelligent system, developed by humans, with the capability to understand and apply technology. Many veterans have portrayed Artificial Intelligence as a Terminator-like figure capable of acting and thinking on its own and carrying out missions without being taught.

When it comes to developing a solid business strategy, AI plays a vital role, and various sectors have incorporated it into their strategies to get better results. In the manufacturing industry in particular, AI capabilities have recently taken centre stage. While AI has been deployed in essential aspects of the business, manufacturing companies have focused the majority of their efforts on fundamental production processes such as product creation, engineering, and assembly, as well as quality testing. Artificial Intelligence has proven to be a game-changer for manufacturing: it has the potential to change the ROI of industrial operations across the board, regardless of segment. But, as previously stated, the true revolution will occur once all stakeholders recognise the importance of technology in redefining the new business era. When it comes to connecting consumers with profitable solutions and efficient products, digital innovation has paved the way for businesses.

Due to several concerns about technological challenges, the Indian manufacturing industry has been hesitant to adopt digitalization. However, with the widespread acceptance of digitalization internationally, India has caught the changing winds in order to stay afloat.

Technology pushing digitalisation

When it comes to industrial automation, technological breakthroughs such as the Internet of Things, connectivity, open software, and electronics have been implemented first. The availability, reliability, and performance of these digital technologies have been the prime and immediate reasons for their adoption in the automation and control of industrial manufacturing, a sector that thrives on precision and mission-critical applications and is heavily bound by forward- and backward-synchronised processes. Manufacturers have seen growing demand for customised technical advances to help improve their processes, as they see reliable applications across many product categories that improve time efficiency in industrial controls and automation.

This is just the beginning; industry executives, though pioneers in their areas, are welcoming digitalization with open arms. This shift in attitude has occurred because digitalization proved to be a lifesaver in the face of the current pandemic. Thanks to digitalization, manufacturing organisations have been able to hold their position on supply chain and production targets through time-bound deliveries across global regions, particularly in Asia. Only now, as the benefits of digitalization in automation have become clear to company leaders and forerunners in the field, has it begun to be widely implemented. Given the current upheaval produced by Covid, the industry's perspective on the benefits of digitalization has been sharpened.

The global manufacturing sector is concentrating on digitalization to improve customer centricity, increase the efficiency of marketing strategies, and channel attractive market prospects through simple and ready-to-use pathways of established procedures. Production will benefit from digitalization in terms of planning, operation, and maintenance.

When it comes to digital overhauls for industry, quantum computing is gaining momentum as well. Companies believe that using exponential data processing in research and development will improve process efficiency while also saving time and resources. Details of highly complex chemical reaction processes can be digitally simulated and analysed in a short amount of time with the help of quantum computing.

Traditional automation suppliers fought hard against the adoption of Ethernet networks and Microsoft support, but they gradually recognised and accepted them. External technologies have increased the pace of production in the industrial business, increasing competitiveness. To remain competitive, the industry blends industrial automation and business information into its computing architecture. A decade ago, no one imagined that technology would be welcomed by the rigorously mechanical industrial manufacturing sector in order to improve production efficiency. However, production efficacy is the key to survival.

Today, the manufacturing sector's future appears brighter; it does not seem to be an unusual ripped page from the book of business synchronicities. Change is never easy to accept, adopt, or execute, but for the industrial manufacturing business, it is necessary to gather fresh sails in order to reach a new shore.

Views expressed above are the author's own.



The Explosive Quantum Computing Stock That Could Save the World – InvestorPlace

I'm a history junkie. So, in this special Sunday issue of Hypergrowth Investing, let me start by sharing an interesting story from history that I bet a lot of you have never heard before but which, interestingly enough, could be the key to enabling you to make money in this tough market.

Back in October of 1927, the world's leading scientists descended upon Brussels for the fifth Solvay Conference, an exclusive, invite-only conference dedicated to discussing and solving the preeminent open problems in physics and chemistry.

In attendance were scientists that, today, we praise as the brightest minds in the history of humankind.

Albert Einstein was there. So was Erwin Schrödinger, who devised the famous Schrödinger's cat thought experiment, and Werner Heisenberg, the man behind the world-changing Heisenberg uncertainty principle, and Louis de Broglie. Max Born. Niels Bohr. Max Planck.

The list goes on and on. Of the 29 scientists who met in Brussels in October 1927, 17 of them went on to win a Nobel Prize.

These are the minds that collectively created the scientific foundation upon which the modern world is built.

And yet, when they all descended upon Brussels nearly 94 years ago, they got stumped by one concept: a concept that for nearly a century has remained the elusive key to unlocking the full potential of humankind.

And now, for the first time ever, that concept, the one that stumped even Einstein, is turning into a disruptive reality via a breakthrough technology that will change the world as we know it, and potentially even save it from a global war.

So what exactly were Einstein, Schrödinger, Heisenberg, and the rest of those Nobel laureates talking about in Brussels back in 1927?

Quantum mechanics.

Now, to be clear, quantum mechanics is a big, complex topic that would require 500 pages to fully understand, but here's my best attempt at a Cliffs Notes version in 500 words instead...

For centuries, scientists have developed, tested, and validated the laws of the physical world, which are known as classical mechanics. These laws scientifically explain how things work. Why they work. Where they come from. So on and so forth.

But the discovery of the electron in 1897 by J.J. Thomson unveiled a new, subatomic world of super-small things that didn't obey the laws of classical mechanics at all. Instead, they obeyed their own set of rules, which have since become known as quantum mechanics.

The rules of quantum mechanics differ from the rules of classical mechanics in two very weird, almost magical ways.

First, in classical mechanics, objects are in one place, at one time. You are either at the store, or at home.

But, in quantum mechanics, subatomic particles can theoretically exist in multiple places at once before they are observed. A single subatomic particle can exist at point A and point B at the same time, until we observe it, at which point it exists at either point A or point B.

So, the true location of a subatomic particle is some combination of all its possible locations.

This is called quantum superposition.
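The math behind that idea is more approachable than it sounds. Here is a minimal sketch (my own illustration, not from the article) of how a superposition is usually described: a vector of complex amplitudes whose squared magnitudes give the probability of each outcome.

```python
import numpy as np

# Illustrative only: a single qubit/particle state is a 2-component vector of
# complex amplitudes. An equal superposition of "point A" (|0>) and "point B" (|1>):
state = np.array([1, 1]) / np.sqrt(2)

# Squared magnitudes of the amplitudes give the measurement probabilities.
probabilities = np.abs(state) ** 2
print(probabilities)  # [0.5 0.5] -- equally likely to be found at A or B

# "Observing" the particle yields a single definite outcome.
outcome = np.random.choice([0, 1], p=probabilities)
print(f"Measured location: {'A' if outcome == 0 else 'B'}")
```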

Second, in classical mechanics, objects can only work with things that are also real. You can't use your imaginary friend to help move the couch. You need your real friend to help you.

But, in quantum mechanics, all of those probabilistic states of subatomic particles are not independent. They're entangled. That is, if we know something about the probabilistic positioning of one subatomic particle, then we know something about the probabilistic positioning of another subatomic particle, meaning that these already super-complex particles can actually work together to create a super-complex ecosystem.

This is called quantum entanglement.
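Again as a rough sketch of my own (not the author's), the textbook example is the Bell state: two particles whose joint description cannot be split into two independent ones, so measuring one immediately tells you about the other.

```python
import numpy as np

# Illustrative only: a two-particle system has 4 amplitudes, one per joint
# outcome |00>, |01>, |10>, |11>. The Bell state (|00> + |11>) / sqrt(2) is
# entangled: neither particle has a definite value alone, yet their outcomes
# are perfectly correlated.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
probs = np.abs(bell) ** 2  # [0.5, 0.0, 0.0, 0.5]

outcomes = ["00", "01", "10", "11"]
for _ in range(5):
    # Only "00" or "11" ever appears: the two particles always agree.
    print(np.random.choice(outcomes, p=probs))
```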

So, in short, subatomic particles can theoretically have multiple probabilistic states at once, and all those probabilistic states can work together, again all at once, to accomplish some task.

And that, in a nutshell, is the scientific breakthrough that stumped Einstein back in the early 1900s.

It goes against everything classical mechanics had taught us about the world. It goes against common sense. But it's true. It's real. And, now, for the first time ever, we are learning how to harness this unique phenomenon to change everything about everything...

Mark my words. Everything will change over the next few years because of quantum mechanics and some investors are going to make a lot of money.

The study of quantum theory has made huge advancements over the past century, especially over the past decade, as scientists at leading technology companies have started to figure out how to harness the strange powers of quantum mechanics to make a new generation of quantum computers that are vastly faster and more powerful than even today's fastest supercomputers.

Again, the physics behind quantum computers is highly complex, but here's my Cliffs Notes version...

Today's computers are built on top of the laws of classical mechanics. That is, they store information in what are called bits, which store data in binary form as either 1 or 0.

But what if you could harness the power of quantum mechanics to turn those classical bits into quantum bits, or qubits, that can leverage superposition to be both 1 and 0 at the same time?

Even further, what if you could take those quantum bits and leverage entanglement to get all of the multi-state bits to work together to solve computationally taxing problems?

You would theoretically create a machine with so much computational power that it would make even today's most advanced supercomputers look like they are from the Stone Age.
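To put a rough number on that intuition (my own back-of-the-envelope sketch, not a claim from the article): an n-qubit register is described by 2^n amplitudes, which is why even simulating a modest quantum machine quickly overwhelms classical hardware.

```python
# Illustrative only: an n-qubit state has 2**n complex amplitudes. A classical
# simulator must store every amplitude explicitly, so memory grows exponentially.
for n in (10, 20, 30, 50):
    amplitudes = 2 ** n
    gigabytes = amplitudes * 16 / 1e9  # ~16 bytes per double-precision complex number
    print(f"{n} qubits -> {amplitudes:,} amplitudes (~{gigabytes:,.1f} GB to simulate)")
```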

That's exactly what is happening today.

Google has built a quantum computer that, on one benchmark task, performed about 158 million times faster than the world's fastest supercomputer.

That's not hyperbole. That's a real number.

Imagine the possibilities if we could broadly deploy a new set of quantum computers 158 million times faster than even today's fastest computers...

We'd finally have the level of AI that you see in movies. That's because the biggest limitation to AI today is the robustness of machine learning algorithms, which are constrained by supercomputing capacity. Expand that capacity, and you get vastly improved machine learning algorithms, and vastly smarter AI.

We could eradicate disease. We already have tools like gene editing, but the effectiveness of gene editing relies on the robustness of the underlying computing capacity to identify, target, insert, cut, and repair genes. Insert quantum computing capacity, and all of that happens without error in seconds, allowing us to truly fix anything about anyone.

We could finally have that million-mile EV. We can only improve batteries if we can test them, and we can only test them in the real world so much. Therefore, the key to unlocking a million-mile battery is cell-level simulation, and the quickness and effectiveness of that simulation rests upon the robustness of the underlying computing capacity. Make that capacity 158 million times bigger, and the simulation will happen 158 million times faster.

The economic opportunities here are truly endless.

But so are the risks

Did you know that most of today's cybersecurity systems are built on top of math-based cryptography? That is, they protect data through encryption that can only be cracked by solving a super-complex math problem. Today, that works, because classical computers cannot solve those super-complex math problems very quickly.

But quantum computers that are 158 million times faster than today's classical computers will be able to solve those math problems in the blink of an eye. Therefore, quantum computers threaten to make math-based cryptography as we know it obsolete, compromising the bulk of the world's modern cybersecurity systems.
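To see concretely what "math-based cryptography" means here, below is a hedged, toy-sized sketch of RSA with deliberately tiny primes (my own illustration, not production code and not anything from the article). The point it shows: the private key falls straight out of the factors of the public modulus, and fast factoring is exactly what a large quantum computer running Shor's algorithm would provide.

```python
# Toy RSA for illustration only (requires Python 3.8+ for pow(e, -1, m)).
# Real keys use primes hundreds of digits long; these are trivially small.
p, q = 61, 53                      # secret primes
n = p * q                          # public modulus (3233)
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent, derived from p and q

message = 42
ciphertext = pow(message, e, n)    # anyone can encrypt with the public key (n, e)
recovered = pow(ciphertext, d, n)  # only the private-key holder can decrypt
assert recovered == message

# An attacker who can factor n = 3233 back into 61 * 53 can recompute d and
# read everything. Trivial at this size, classically infeasible at real key
# sizes -- unless a sufficiently large quantum computer does the factoring.
```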

Insiders call this the Quantum Threat. It's a huge deal. When the Quantum Threat arrives, no digital data will be safe.

Back in 2019, computer scientists believed the Quantum Threat to be a distant one, something that might arrive by 2035. However, since then, rapid advancements in quantum computing capability have considerably moved up that timeline. Today, many experts believe the Quantum Threat will arrive in the 2025 to 2030 window.

That means the world needs to start investing in quantum-proof encryption today, and that's why, from an investment perspective, we believe quantum encryption stocks will be among the market's biggest winners in the 2020s.

The global information security market is tracking towards $300 BILLION. That entire market will inevitably have to shift towards quantum encryption by 2030. Therefore, we're talking about the creation of a $300 billion market to save the planet from a security meltdown.

And, at the epicenter of this multi-hundred-billion-dollar, planet-saving megatrend, is one tiny startup that is pioneering the single most robust quantum encryption technology platform the world has ever seen...

This company is working with the U.S. government, the UK government, and various other defense and intelligence agencies to finalize its breakthrough technology platform. The firm plans to launch the quantum encryption system, globally, in 2023.

If the tech works at scale, this tiny stock, which is trading for less than $20, will roar higher by more than 10X by 2025.

And guess what? We just bought this stock in our flagship investment research product, Innovation Investor.

Trust me. This is a stock pick you are not going to want to miss; it may be the single most promising investment opportunity I've come across over the past few years.

And, with a war raging on in Europe for the first time since World War II, the economic and political importance of this stock has never been bigger.

To gain access to that stock pick, and a full portfolio of other potential 10X tech stock picks for the 2020s, click here.

On the date of publication, Luke Lango did not have (either directly or indirectly) any positions in the securities mentioned in this article


NIST Set to Announce Round 3 Post-Quantum Cryptography (PQC) Selections Within the Next Few Weeks – Quantum Computing Report

In December 2016, the U.S. National Institute of Standards and Technology (NIST) announced a competition to select new quantum-resistant public key encryption algorithms that would eventually supersede RSA and the other classical public key cryptography algorithms that may be vulnerable to future quantum computers. For the past five years NIST has been receiving nominations, holding conferences, and going through three rounds of selection to determine which ones to recommend based upon security, performance, and other factors. It is very close to completing Round 3 and will soon announce its initial selections of new algorithms to recommend. Some algorithms still need more study, and there will be a Round 4 to see whether any additional ones should be standardized too. The algorithms designated as Finalists are being considered for standardization in Round 3, and the algorithms designated as Alternates are being considered for further analysis and possible standardization in Round 4.

Once the Round 3 selections are announced, NIST will publish a report explaining its decisions. After that, there will still be additional work to draft the standards and call for public comments, and the selections probably won't be officially formalized until 2024. But we see these activities as formalities that won't result in any significant changes. In addition, the Round 4 analysis and recommendation activities will take 12-18 months to complete after the Round 4 candidates are announced.

When we listen to presentations from various consultants and quantum computing providers, we often hear the message that enterprises should start investigating quantum computing now or else they will be left behind. But it is our view that it is just as important, if not more so, for enterprises to allocate resources and start planning right now how to migrate their entire digital communications infrastructure to quantum-resistant encryption techniques. Although it may take another 10 years or so before a quantum computer large enough to run Shor's algorithm and break the current public key algorithms is available, experience has shown that it takes 10 years or more to implement new encryption technology in the thousands of computers and software programs in use within a typical enterprise.
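One concrete, low-risk first step in that migration planning is simply inventorying where quantum-vulnerable public-key algorithms are still in use. The sketch below is our own illustration under stated assumptions (the open-source Python "cryptography" package and a hypothetical certs/ directory of PEM certificates), not a tool recommended by NIST or the report.

```python
# Minimal sketch: flag certificates whose public keys rely on RSA or elliptic
# curves, the algorithms Shor's algorithm would break. Assumes a recent version
# of the third-party "cryptography" package; the certs/ path is a placeholder.
import glob

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa, ec

for path in glob.glob("certs/*.pem"):  # hypothetical certificate store
    with open(path, "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())
    key = cert.public_key()
    if isinstance(key, (rsa.RSAPublicKey, ec.EllipticCurvePublicKey)):
        print(f"{path}: {type(key).__name__} -- will need a quantum-resistant replacement")
    else:
        print(f"{path}: {type(key).__name__}")
```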

For those CIOs who experienced the intensive Y2K conversion activities twenty years ago, this migration will likely be significantly more complex. The number of computers, smartphones, IoT, and other digital devices in use today is orders of magnitude higher than it was earlier this century. Also, while Y2K had a specific deadline of December 31, 1999, no one really knows when the large, powerful quantum machines will be in operation. In addition, any communications of long shelf-life data may be vulnerable to a Harvest Now, Decrypt Later attack that accelerates the time frame when quantum resistant encryption is needed. So, enterprises planning a strategy have some important questions to answer such as:

With the pending announcement of the first selected algorithms from NIST, now would be the time to get going if you haven't started already. For additional information on this topic, we recommend reading the white paper from the Quantum Economic Development Consortium (QED-C) titled "A Guide to a Quantum-Safe Organization." You can also visit the Post-Quantum Cryptography website maintained by NIST, which contains an archive of the submissions, presentations, workshops and events that have occurred during this program.

March 5, 2022


NATO and White House recognize post-quantum threats and prepare for Y2Q – VentureBeat


Over the past decade, encryption has emerged as one of the key solutions that organizations use to secure enterprise communications, services and applications. However, the development of quantum computing is putting these defenses at risk, with the next generation of computers having the capability to break the public-key cryptography (PKC) algorithms that underpin them.

While quantum computing technology is still in its infancy, the potential threat of PKC decryption remains. Yesterday, the NATO Cyber Security Centre (NCSC) announced that it had tested a post-quantum VPN from U.K.-based quantum computing provider Post-Quantum to secure its communication flows.

Post-Quantum's VPN uses post-quantum cryptography that it claims is complex enough to prevent an attacker armed with a quantum computer from decrypting transmissions.

The development of these post-quantum cryptographic solutions gives enterprises and technical decision makers a way to protect their encrypted data from future quantum computers.

NATO isn't alone in taking post-quantum cyberattacks seriously. The U.S. National Institute of Standards and Technology (NIST) recently announced that it was developing a standard for migrating to post-quantum cryptography, to begin replacing hardware, software, and services that rely on public-key algorithms.

At the same time, the White House is also concerned about the threat posed by quantum computing, recently releasing a National Security Memorandum that gave the National Security Agency (NSA) 30 days to update the Commercial National Security Algorithm Suite (CNSA Suite) and add quantum-resistant cryptography.

The memorandum also noted that within 180 days, agencies that handle national security systems must identify all instances of encryption not in compliance with NSA-approved Quantum Resistant Algorithms and chart a timeline to transition these systems to use compliant encryption, to include quantum resistant encryption.

While quantum computers aren't yet capable of breaking modern public key algorithms like RSA, Post-Quantum's CEO Andersen Cheng believes that as quantum technology develops, we will reach a Y2Q scenario, where all these security measures become obsolete in the face of the computational power of weaponized quantum computers.

"People frequently talk about commercial quantum computers when referencing this Y2Q moment, and that's a long way off, potentially 10-15 years away. But from a cybersecurity perspective, we're not talking about slick commercial machines; a huge, poorly functioning prototype in the basement is all that's needed to break today's encryption," Cheng said.

"It does not need to go through any benchmark review or certification, and this prospect is much closer; it could happen within the next three to five years," Cheng said.

If Cheng is correct that non-commercial quantum computing systems could be developed to weaponize quantum computing within just a few years, then organizations have a tight timeline to enhance their encryption protections, or they risk handing malicious entities and nation-states a skeleton key to their private data.

However, it's not just data exposed post-Y2Q that's at risk; potentially any encrypted data that's been harvested in the past could then be decrypted as part of a retrospective attack.

"Quantum decryption can be applied retrospectively, in that the groundwork for a harvest now, decrypt later attack could be laid today. This means that, if a rogue nation-state or bad actor intercepted data today, they could decrypt this harvested data once quantum computers' capabilities exceed those of classical computers," he said.

As more enterprises recognize the need for post-quantum cryptography, the post-quantum cryptography market is anticipated to reach $9.5 billion by 2029, with more than 80% of the market's revenue coming from web browsers, the IoT, machine tools, and the cybersecurity industry.

While quantum computing could pose a substantial threat to enterprises down the line, there are a wide range of solution providers emerging who are developing state-of-the-art post-quantum cryptographic solutions to mitigate this.

One such provider is U.K.-based post-quantum specialist PQShield, which offers a range of quantum-secure solutions, from IoT firmware to PKI, mobile and server technologies, as well as end-user applications.

Some of PQShield's most recent developments include its researchers and engineers contributing to the NIST Post-Quantum Cryptography Standardization Process, and the company recently raising $20 million in a Series A funding round.

Another promising provider is Crypta Labs, which raised £5.5 million ($7.4 million USD) in seed funding in 2020 and recently developed the world's first space-compliant Quantum Random Number Generator, which will be used to securely encrypt satellite data.

Post-Quantum itself is also in a strong position, with its encryption algorithm NTS-KEM becoming the only code-based finalist in the NIST process to identify a cryptographic standard to replace RSA and Elliptic Curve for PKC in the post-quantum world.

In any case, the wave of providers developing state-of-the-art cryptographic algorithms means there are plenty of solutions for enterprises to deploy, now and in the future, to mitigate the risk of quantum computing and ensure that their private data stays protected.

