Archive for the ‘Quantum Computer’ Category

Want to visit 2 million stars? There's a shortcut, but you'll need a warp drive – SYFY WIRE

It's Towel Day. Really. Someone has actually come up with a real Hitchhiker's Guide to the Galaxy, though there are no directions to the Restaurant at the End of the Universe.

Mathematicians William Cook and Keld Helsgaun have found the shortest path for visiting 2 million stars, finally solving a long-standing mystery. By analyzing data from the Gaia space telescope that included 2,079,471 stars in our galaxy, they mapped the most efficient 3D route that hops to each of them exactly once, with an extremely small margin for error. The only problem is that you'd need a warp drive that could travel at the speed of light (at least), and we're definitely not there yet. That, and you would probably have to be immortal or close to it, as the journey would still take about 100 million years.

Cook and Helsgaun were after the solution to the traveling salesman problem, which asks for the shortest route through multiple destinations with only one stop at each. The journey eventually brings you back to where you started, which, in this scenario, is the sun. Helsgaun searched for ways to find better and better tours, while Cook proved guarantees on how short a tour could possibly be. The tours inform the guarantees, while the guarantees help improve the tours. Doing this with the stars in our galaxy makes it the largest instance of the problem ever tackled.
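To make the problem concrete, here is a minimal Python sketch of the simplest tour-building idea, the nearest-neighbor heuristic: always hop to the closest unvisited star, then return home. Cook and Helsgaun's actual solvers are vastly more sophisticated, and the coordinates below are invented toy values, but the structure of the problem is the same.

```python
# Greedy nearest-neighbor heuristic for the traveling salesman problem.
# Illustrative only: real solvers produce far shorter tours with proofs.
import math

def nearest_neighbor_tour(points, start=0):
    """Build a tour by always hopping to the closest unvisited point."""
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    tour.append(start)  # return to the starting point (the sun)
    return tour

def tour_length(points, tour):
    return sum(math.dist(points[a], points[b]) for a, b in zip(tour, tour[1:]))

# A handful of invented 3D "star" positions (arbitrary units)
stars = [(0, 0, 0), (1, 2, 0), (4, 1, 3), (2, 2, 2), (5, 5, 1)]
tour = nearest_neighbor_tour(stars)
print(tour, round(tour_length(stars, tour), 2))
```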

"To make the rotating maps of the tours, we used the three.js JavaScript 3D library," Cook told SYFY WIRE. "The 3D positions of the stars were taken from a data set created at the Max Planck Institute for Astronomy. We used somewhere between 100 and 200 CPU years in the computations, running on a network of computers when they were not otherwise occupied."

The problem took up to 200 years of computing time that was compressed into just two years (it is faster when you don't have thousands of people trying to log onto the same network). Quantum computers could potentially speed up that process, but Cook has doubts. If someone could come up with technology advanced enough for a huge and extremely powerful quantum computer, it could help with finding shorter tours, and quantum search could help shorten those routes further. The problem is that we're not technologically there yet. The quantum computers that do exist are simply unable to support such an extreme dataset, let alone dream up every possible tour at once.

"It is not at all clear that a quantum computer can help in solving large instances of the traveling salesman problem," Cook said. "In particular, there is no indication that quantum computing can help substantially in finding a guarantee needed to prove a tour is shortest possible."

Gaia, whose mission is to make the largest and most precise 3D map of the Milky Way, has released data on the locations of 1.33 billion stars, so Cook and Helsgaun are now trying to figure out the shortest route between them, a dataset more than 600 times larger than the last. Their best tour yet is over 15 trillion light years long. So far, they can only guarantee that it is no more than a factor of about 1.0038 longer than the shortest possible tour. That seems like nothing, but it is a far greater margin for error than the factor of 0.0000074 achieved for the 2 million-star route, which amounts to 700 light years for that particular tour. Not bad compared to the nearly hundred million light years the entire trip would span. Even then, Cook still wants to push it further.

"We have found a set of rules (using parallel computing) that we hope will give a strong guarantee, but the huge scale of the problem makes it difficult to find the combination of the rules that we need," he said. "This combination task is called linear programming; it is the workhorse for the field of mathematical optimization. Our 1.33-billion star project is driving the creation of LP algorithms to handle examples nearly 1,000 times larger than was previously possible."
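For readers unfamiliar with the technique, here is a toy linear program solved with SciPy's linprog, at a scale of two variables rather than 1.33 billion stars; the objective and constraint values are invented purely for illustration.

```python
# A toy linear program: linprog minimizes c @ x subject to A_ub @ x <= b_ub.
from scipy.optimize import linprog

c = [-1, -2]                 # maximize x + 2y by minimizing its negative
A_ub = [[1, 1], [1, -1]]     # constraints: x + y <= 4 and x - y <= 2
b_ub = [4, 2]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)       # optimal point (0, 4) and objective value 8
```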

By the way, because Cook and Helsgaun believe that the 2 million-star tour can be made even shorter than the current guarantee, they are offering a reward* of $50 for each parsec (3.26 light years) that can be saved by rearranging the route through those 2,079,471 stars, up to a $10,000 total. Just saying.

*It's legit. Cook personally asked your friendly neighborhood writer to spread the word about this.

Go here to read the rest:
Want to visit 2 million stars? There's a shortcut, but you'll need a warp drive - SYFY WIRE

House Democrats introduce bill to invest $900 billion in STEM research and education – The Hill

Rep. Ro Khanna (D-Calif.) and several other House Democrats introduced legislation on Tuesday to invest in and train a technologically proficient workforce for the future.

The 21st Century Jobs Act would invest $900 billion over ten years in research and development efforts around emerging technologies including artificial intelligence (AI), cybersecurity and biotechnology, along with prioritizing science, technology, engineering and mathematics (STEM) education.

It would establish a Federal Institute of Technology (FIT) that would be spread out across the nation at 30 different locations including existing educational facilities, along with promoting STEM education in public schools.

Specifically, the bill would help fund computer science courses for K-12 students, carve out scholarships for those pursuing degrees in the STEM fields, allocate $8 billion to train teachers in STEM fields, and create tax incentives for companies to hire individuals who attended a FIT institution or received a STEM scholarship in order to diversify the talent field.

According to a summary of the bill, it would ultimately create around 3 million new jobs per year and significantly raise public investment in research and development, helping the U.S. keep pace with other nations on the international stage.

The bill is also sponsored by Democratic Reps. Nanette Barragán (Calif.), Suzan DelBene (Wash.), Dwight Evans (Pa.), Jim Himes (Conn.), Pramila Jayapal (Wash.), Tim Ryan (Ohio) and Darren Soto (Fla.), as well as House Homeland Security Committee Chairman Bennie Thompson (D-Miss.).

Several former Democratic tech-related officials endorsed the bill on Tuesday, including Jared Bernstein, former chief economist to former Vice President Joe Biden, who said in a statement that "we've got tremendous international catch-up to do in this space, and this proposal is the first I've seen that's scaled to the magnitude of the challenge."

"Ro Khanna's 21st Century Jobs Package is advancing an important, ambitious agenda that would both increase economic growth and also help more people benefit from that growth," Jason Furman, a professor of the Practice of Economic Policy at Harvard University and the former chair of the Council of Economic Advisers during the Obama administration, said in a separate statement.

"Khanna's proposal would unleash the largest race to the top in American history as areas around the country compete not to provide tax benefits for private companies but instead to improve education, infrastructure, housing, and the climate for local innovation and development," Furman added.

Investment in developing technologies and in STEM education and workforce has been a rare topic of bipartisan support on Capitol Hill. Sens. Jacky Rosen (D-Nev.) and Cindy Hyde-Smith (R-Miss.) introduced legislation in September to provide $50 million to help small and medium-sized businesses hire and train professionals in the STEM field, particularly those who are female, Black or Latino or from rural areas.

A bipartisan group of senators led by Senate Minority Leader Chuck Schumer (D-N.Y.) introduced a separate bill in May that would funnel $100 billion over five years into U.S. science and technology research.

The Trump administration has also zeroed in on promoting investment in emerging science and technology fields.

The U.S. and the United Kingdom signed a formal agreement last month to promote cooperation on AI development, while the administration announced in August it would funnel over $1 billion over the next five years into funding new research institutes focused on AI and quantum computing development.

Read more here:
House Democrats introduce bill to invest $900 billion in STEM research and education - The Hill

Why AI Geniuses Haven’t Created True Thinking Machines – Walter Bradley Center for Natural and Artificial Intelligence

As we saw yesterday, artificial intelligence (AI) has enjoyed a string of unbroken successes against humans. But these are successes in games, where the map is the territory. Therefore, everything is computable.

That fact hints at the problem tech philosopher and futurist George Gilder raises in Gaming AI (free download here). Whether all human activities can be treated that way successfully is an entirely different question. As Gilder puts it, AI is a system built on the foundations of computer logic, and when Silicon Valley's AI theorists push the logic of their case to a singularity, they defy the most crucial findings of twentieth-century mathematics and computer science.

Here is one of the crucial findings they defy (or ignore): Philosopher Charles Sanders Peirce (1839–1914) pointed out that, generally, mental activity comes in threes, not twos (so he called it triadic). For example, you see a row of eggs in a carton and think "12." You connect the objects (eggs) with a symbol, "12."

In Peirce's terms, you are the interpretant, the one for whom the symbol "12" means something. But eggs are not "12," and "12" is not eggs. Your interpretation is the third factor that makes "12" mean something with respect to the eggs.

Gilder reminds us that, in such a case, "the map is not the territory" (p. 37). Just as "12" is not the eggs, a map of California is not California. To mean anything at all, the map must be read by an interpreter. AI supremacy assumes that the machine's map can somehow be big enough to stand in for the reality of California and eliminate the need for an interpreter.

The problem, he says, is that the map is not and never can be reality. There is always a gap:

Denying the interpretant does not remove the gap. It remains intractably present. If the inexorable uncertainty, complexity, and information overflows of the gap are not consciously recognized and transcended, the gap fills up with noise. Congesting the gap are surreptitious assumptions, ideology, bias, manipulation, and static. AI triumphalism allows it to sink into a chaos of constantly changing but insidiously tacit interpretations.

Ultimately AI assumes a single interpretant created by machine learning as it processes ever more zettabytes of data and converges on a single interpretation. This interpretation is always of a rearview mirror. Artificial intelligence is based on an unfathomably complex and voluminous look at the past. But this look is always a compound of slightly wrong measurements, thus multiplying its errors through the cosmos. In the real world, by contrast, where interpretation is decentralized among many individual minds (each person interpreting each symbol), mistakes are limited, subject to ongoing checks and balances, rather than being inexorably perpetuated onward.

Does this limitation make a difference in practice? It helps account for the ongoing failure of Big Data to provide consistently meaningful correlations in science, medicine, or economics research. Economics professor Gary Smith puts the problem this way:

Humans naturally assume that all patterns are significant. But AI cannot grasp the meaning of any pattern, significant or not. Thus, from massive number crunches, we may learn (if that's the right word) that:

Stock prices can be predicted from Google searches for the word "debt."

Stock prices can be predicted from the number of Twitter tweets that use "calm" words.

An unborn baby's sex can be predicted by the amount of breakfast cereal the mother eats.

Bitcoin prices can be predicted from stock returns in the paperboard-containers-and-boxes industry.

Interest rates can be predicted from Trump tweets containing the words "billion" and "great."

If the significance of those patterns makes no sense to you, it's not because you are not as smart as the Big Data machine. Those patterns shouldn't make any sense to you. There's no sense in them because they are meaningless.

Smith, author with Jay Cordes of The Phantom Pattern Problem (Oxford, 2020), explains that these phantom patterns are a natural occurrence within the huge amounts of data that big computers crunch:

…even random data contain patterns. Thus the patterns that AI algorithms discover may well be meaningless. Our seduction by patterns underlies the publication of nonsense in good peer-reviewed journals.

Yes, such meaningless findings from Big Data do creep into science and medicine journals. That's partly a function of thinking that a big computer can do our thinking for us, even though it can't recognize the meaning of patterns. It's what happens when there is no interpreter.
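Smith's point is easy to reproduce. This short sketch (invented random data only, using numpy) searches thousands of meaningless series for one that correlates with an equally meaningless "target," and reliably finds a strong-looking pattern by chance alone.

```python
# Phantom patterns: search enough random series and some will correlate
# strongly with your target purely by chance.
import numpy as np

rng = np.random.default_rng(0)
target = rng.standard_normal(50)                 # 50 fake "daily returns"
candidates = rng.standard_normal((10_000, 50))   # 10,000 unrelated series

corrs = np.array([np.corrcoef(target, c)[0, 1] for c in candidates])
print(f"best |correlation| found by chance: {np.abs(corrs).max():.2f}")
# Typically prints about 0.5 or higher, a "pattern" with no meaning at all.
```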

Ah, but (so we are told) quantum computers will evolve so as to save the dream of true thinking machines. Gilder has thought about that one too. In fact, he's been thinking about it since 1989, when he published Microcosm: The Quantum Era in Economics and Technology.

It's true that, in the unimaginably tiny quantum world, electrons can do things we can't:

A long-ago thought experiment of Einstein's showed that once any two photons (or other quantum entities) interact, they remain in each other's influence no matter how far they travel across the universe (as long as they do not interact with something else). Schrödinger christened this entanglement: The spin (or other quantum attribute) of one behaves as if it reacts to what happens to the other, even when the two are impossibly remote.

But, he says, it's also true that continuously observing a quantum system will immobilize it (the quantum Zeno effect). As John Wheeler reminded us, we live in a participatory universe where the observer (Peirce's interpretant) is critical. So quantum computers, however cool they sound, still play by rules where the interpreter matters.

In any event, at the quantum scale, "we are trying to measure atoms and electrons using instruments composed of atoms and electrons" (p. 41). That is self-referential and introduces uncertainty into everything: "With quantum computing, you still face the problem of creating an analog machine that does not accumulate errors as it processes its data" (p. 42). Now we are back where we started: making the picture within the machine much bigger and more detailed will not make it identical to the reality it is supposed to interpret correctly.

And remember, we still have no idea how to make the Ultimate Smart Machine conscious because we don't know what consciousness is. We do know one thing for sure now: If Peirce is right, we could turn most of the known universe into processors and still not produce an interpreter (the consciousness that understands meaning).

Robert J. Marks points out that human creativity is non-algorithmic and therefore uncomputable. From which Gilder concludes, "The test of the new global ganglia of computers and cables, worldwide webs of glass and light and air, is how readily they take advantage of unexpected contributions from free human minds in all their creativity and diversity. These high-entropy phenomena cannot even be readily measured by the metrics of computer science" (p. 46).

It's not clear to Gilder that the AI geniuses of Silicon Valley are taking this in. The next Big Fix is always just around the corner, and the Big Hype is always at hand.

Meanwhile, the rest of us can ponder an idea from technology philosopher George Dyson: "Complex networks (of molecules, people or ideas) constitute their own simplest behavioral descriptions" (p. 53). He was explaining why analog quantum computers would work better than digital ones. But, considered carefully, his idea also means that you are ultimately the best definition of you. And that's not something that a Big Fix can just get around.

Here's the earlier article: Why AI geniuses think they can create true thinking machines. Early on, it seemed like a string of unbroken successes. In Gaming AI, George Gilder recounts the dizzying achievements that stoked the ambition, and the hidden fatal flaw.

See the original post here:
Why AI Geniuses Haven't Created True Thinking Machines - Walter Bradley Center for Natural and Artificial Intelligence

Every Thing You Need to Know About Quantum Computers – Analytics Insight

Quantum computers are machines that use the properties of quantum physics to store data and perform calculations based on the probability of an object's state before it is measured. This can be extremely advantageous for certain tasks, where they could vastly outperform even the best supercomputers.

Quantum computers can process massive and complex datasets more efficiently than classical computers. They use the fundamentals of quantum mechanics to speed up the process of solving complex calculations. Often, these computations incorporate a seemingly unlimited number of variables, and the potential applications span industries from genomics to finance.

Classical computers, which include smartphones and laptops, carry out logical operations using the definite position of a physical state. They encode information in binary bits that can be either 0s or 1s. In quantum computing, operations instead use the quantum state of an object to produce the basic unit of memory, called a quantum bit or qubit. Qubits are made using physical systems, such as the spin of an electron or the orientation of a photon. These systems can be in many different arrangements all at once, a property known as quantum superposition. Qubits can also be inextricably linked together using a phenomenon called quantum entanglement. The result is that a series of qubits can represent different things simultaneously. These states are the undefined properties of an object before they've been detected, such as the spin of an electron or the polarization of a photon.

Instead of having a clear position, unmeasured quantum states occur in a mixed superposition that can be entangled with the states of other objects, so that their final outcomes are mathematically related even before they are determined. The complex mathematics behind these unsettled states of entangled spinning coins can be plugged into special algorithms to make short work of problems that would take a classical computer a long time to work out.
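For readers who prefer symbols to spinning-coin metaphors, here is a minimal classical simulation, in Python with numpy, of the two properties just described: a single qubit in superposition and a pair of qubits entangled into a Bell state. It illustrates the arithmetic only; it is not a real quantum device.

```python
# State-vector sketch of superposition and entanglement (classical simulation).
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

plus = H @ ket0                 # superposition: (|0> + |1>) / sqrt(2)
print(np.abs(plus) ** 2)        # 50/50 measurement probabilities

# Entangle two qubits: Hadamard on the first, then CNOT
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
state = CNOT @ np.kron(plus, ket0)   # Bell state (|00> + |11>) / sqrt(2)
print(np.abs(state) ** 2)            # only 00 and 11 are ever observed
```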

American physicist and Nobel laureate Richard Feynman alluded to quantum computers as early as 1959. He stated that when electronic components begin to reach microscopic scales, effects predicted by quantum mechanics occur, which might be exploited in the design of more powerful computers.

During the 1980s and 1990s, the theory of quantum computers advanced considerably beyond Feynman's early speculation. In 1985, David Deutsch of the University of Oxford described the construction of quantum logic gates for a universal quantum computer. In 1994, Peter Shor of AT&T devised an algorithm to factor numbers with a quantum computer that would require as few as six qubits. Later, in 1998, Isaac Chuang of Los Alamos National Laboratory, Neil Gershenfeld of the Massachusetts Institute of Technology (MIT) and Mark Kubinec of the University of California created the first 2-qubit quantum computer that could be loaded with data and output a solution.

More recently, physicist David Wineland and his colleagues at the US National Institute of Standards and Technology (NIST) announced that they had created a 4-qubit quantum computer by entangling four ionized beryllium atoms using an electromagnetic trap. Today, quantum computing is poised to upend entire industries, from telecommunications and cybersecurity to advanced manufacturing, finance, medicine and beyond.

There are three primary types of quantum computing. Each type differs by the amount of processing power (qubits) needed and the number of possible applications, as well as the time required to become commercially viable.

Quantum annealing is best for solving optimization problems, in which researchers seek the best and most efficient configuration among many possible combinations of variables.

Volkswagen recently conducted a quantum experiment to optimize traffic flows in the overcrowded city of Beijing, China. The experiment was run in partnership with Google and D-Wave Systems. The Canadian company D-Wave developed the quantum annealer used, though it is difficult to tell whether it actually exhibits any real quantumness so far. Still, the algorithm successfully reduced traffic by choosing the ideal path for each vehicle.
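Annealers such as D-Wave's are typically given problems in QUBO form (quadratic unconstrained binary optimization): minimize x^T Q x over binary vectors x. The brute-force sketch below, with an invented Q matrix, shows the formulation at toy scale; a real annealer explores the same energy landscape physically rather than by enumeration.

```python
# Toy QUBO problem: find the binary vector x minimizing x^T Q x.
import itertools
import numpy as np

Q = np.array([[-1,  2,  0],
              [ 0, -1,  2],
              [ 0,  0, -1]])   # illustrative QUBO matrix, values invented

best = min(itertools.product([0, 1], repeat=3),
           key=lambda x: np.array(x) @ Q @ np.array(x))
print(best)  # the binary assignment with the lowest "energy": (1, 0, 1)
```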

Quantum simulations explore specific problems in quantum physics that are beyond the capacity of classical systems. Simulating complex quantum phenomena could be one of the most important applications of quantum computing. One particularly promising area is modeling the interactions of large numbers of subatomic particles in chemical systems, also known as quantum chemistry.

Universal quantum computers are the most powerful and most generally applicable, but also the hardest to build. Remarkably, a universal quantum computer would likely need over 100,000 qubits, and some estimates put the requirement at 1 million qubits. Disappointingly, the most qubits we can access now is just 128. The basic idea behind the universal quantum computer is that you could direct the machine at any massively complex computation and get a quick solution. This includes solving the aforementioned annealing equations, simulating quantum phenomena, and more.

See original here:
Every Thing You Need to Know About Quantum Computers - Analytics Insight

Quantum Computing and the Cryptography Conundrum – CXOToday.com

By: Anand Patil

On October 23, 2019, researchers from Google made an official announcement of a major breakthrough, one that scientists compared to the Wright Brothers' first flight, or even man's first moon landing. They claimed to have achieved Quantum Supremacy, meaning that they had created a quantum computer that could perform a calculation considered impossible for the classical computers of today. The announcement was a landmark, highlighting the possibilities of Quantum Computing.

The concept of Quantum Computing itself isn't new. It is a field that has been a point of interest of physicists and computer researchers since the 1980s. Google's announcement, however, has brought it to the mainstream and shone a spotlight on the promise this niche field of innovation holds. Of course, as someone once said, with great power comes great responsibility, so this field isn't without complexities.

The Possibilities of Quantum Computing

Quantum Computing is a branch of computer science that is focused on leveraging the principles of quantum physics to develop computer technology. Quantum Computers hold the promise to power major advances in various fields that require complex calculations, from materials science and pharmaceuticals to aerospace and artificial intelligence (AI).

So far, Quantum Computers have been nothing more than fancy laboratory experiments, large and expensive, but they have successfully demonstrated that the underlying principles are sound and have the potential to transform industries and accelerate innovation like never before. This has spurred scientific and industrial interest in this nascent field, giving rise to multiple projects across the world in pursuit of creating a viable, general-use Quantum Computer. That said, it may still be many years before Quantum Computers are commercially and generally available.

So Why Does It Matter Today?

The possibility of Quantum Computers poses a serious challenge to the cryptographic algorithms deployed widely today. Today's key-exchange algorithms, like RSA, Diffie-Hellman, and others, rely on very difficult mathematical problems, such as prime factorization, for their security, which a Quantum Computer would be able to solve much faster than a classical computer.

For example, it would take a classical computer centuries or even longer to break modern algorithms like DH or RSA-2048 using brute-force methods. However, given the power and efficiency of quantum machines in calculations such as finding the prime factors of large numbers, it may be possible for a quantum computer to break current asymmetric algorithms in a matter of days.
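A toy example makes the asymmetry concrete. Trial division factors a small semiprime almost instantly, but its cost grows with the square root of the modulus, which is why a 2048-bit RSA modulus is hopeless classically while Shor's algorithm on a large enough quantum computer would factor it in polynomial time. The primes below are tiny illustrative values, not real key material.

```python
# Why factoring protects RSA classically: trial division scales as sqrt(n).

def trial_factor(n):
    """Find the smallest prime factor of n by trial division."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return n  # n itself is prime

p, q = 99989, 99991          # two small primes; real RSA primes are ~300 digits
n = p * q
print(trial_factor(n))       # recovers 99989 after roughly 1e5 divisions
# For a 2048-bit RSA modulus, sqrt(n) is about 2**1024 steps: out of reach.
```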

So, while the encrypted internet is not at risk at the moment, all a bad actor has to do is capture the encrypted data today, including the initial key exchange, and then wait until a powerful enough quantum computer is available to decrypt it. This is a particular problem for organizations that have large amounts of sensitive data to protect over the long term, such as banks, governments and defense agencies.

What Can I Do Now?

For organizations that could be at risk in the future, this is the best time to start evaluating post-quantum cryptography. Simply put, this means moving to algorithms and/or keys that are a lot more robust and can withstand a brute-force attack by a quantum computer, i.e., that are quantum-resistant.

The National Institute of Standards and Technology (NIST) in the US is leading the effort towards the standardization of post-quantum secure algorithms. However, given the lengthy process involved, this may take many years to fructify.

An alternative is to use Quantum Key Distribution (QKD) techniques with existing algorithms that are considered quantum-safe. This involves using a dedicated optical channel to exchange keys using the quantum properties of photons. Any attempt to tap this secure channel will change the quantum state of the photons and can be immediately detected, which is what makes the key unhackable. One limitation of QKD in this method is the need for a dedicated optical channel that cannot span more than 50 km between the two terminals. Of course, this also means that the existing encryption devices or routers should be capable of ingesting such quantum-generated keys.
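As a sketch of how QKD turns quantum measurement into a shared secret, here is a minimal classical simulation of the sifting step of BB84, one well-known QKD protocol (the article does not say which protocol the systems described here use). Measuring in a mismatched basis yields a random result, which is also why an eavesdropper cannot copy the photons undetected.

```python
# Simplified BB84 sifting: keep only bits where sender and receiver
# happened to choose the same measurement basis.
import numpy as np

rng = np.random.default_rng(1)
n = 32
alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)      # 0 = rectilinear, 1 = diagonal
bob_bases = rng.integers(0, 2, n)

# Matching basis: Bob reads the bit correctly; otherwise his result is
# random, the same 50/50 uncertainty an interceptor would face.
match = alice_bases == bob_bases
bob_bits = np.where(match, alice_bits, rng.integers(0, 2, n))

key = alice_bits[match]                  # the sifted shared key
assert np.array_equal(key, bob_bits[match])
print(key)
```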

Post-Quantum Cryptography and Cisco

Cisco is an active contributor to the efforts to standardize post-quantum algorithms. However, recognizing that an implementable standard may be some years away, work is ongoing to ensure that organizations can implement quantum-resistant encryption techniques in the interim that leverage existing network devices, like routers, which are most commonly used as encryptors.

To start with, a team of veteran technical leaders and cryptography experts from Cisco US (David McGrew, Scott Fluhrer and Lionel Florit), together with the engineering team in Cisco India led by Amjad Inamdar and Ramas Rangaswamy, developed an API interface called the Secure Key Import Protocol (SKIP) through which Cisco routers can securely ingest keys from an external post-quantum key source. This allows existing Cisco routers to be quantum-ready with just the addition of an external QKD system. Going forward, this team is working on a way to deliver quantum-safe encryption keys without the need for short-range point-to-point connections.

The advantage of this method is that organizations can integrate post-quantum key sources with existing networking gear in a modular fashion without the need to replace anything already installed. In this manner, you could create a quantum-ready network for all traffic with minimal effort.

Getting Ready for the Post-Quantum World

Quantum Supremacy is an event which demonstrates that a quantum machine is able to solve a problem that no classical computer can solve in a feasible amount of time. This race has gathered momentum in the recent past, with several companies joining the bandwagon and some even claiming to have achieved it.

There is an unprecedented amount of attention focused on making a commercially viable quantum computer. Many believe it is inevitable, and only a question of time. When it does happen, the currently used cryptography techniques will become vulnerable, and therefore be limited in their security. The good news is, there are methods available to adopt strong encryption techniques that will remain secure even after quantum computers are generally available.

If you are an organization that wants to protect its sensitive data over the long term, you should start to evaluate post-quantum secure encryption techniques today. By leveraging existing networking infrastructure and adding suitable post-quantum key distribution techniques, it is possible to take a quantum leap in securing your data.

(The author is Director, Systems Engineering, Cisco India and SAARC. The views expressed in this article are his own.)

Read the original post:
Quantum Computing and the Cryptography Conundrum - CXOToday.com