Archive for the ‘Quantum Computing’ Category

Quantum computing breakthrough achieved, road to the future begins now – TweakTown

A team of researchers has achieved what is being described as a "breakthrough" in quantum computing.


The achievement comes from a team of researchers at the RIKEN Center for Emergent Matter Science, who have entangled a three-qubit array in silicon and measured its state with high fidelity. For those who don't know, instead of using bits to make calculations and perform tasks like a typical computer does, quantum computers use quantum bits, or qubits.

The device the researchers created used three very small blobs of silicon called quantum dots, each of which can hold one electron. The direction of the electron's spin encodes the qubit. With that in mind, it should be noted that a "Two-qubit operation is good enough to perform fundamental logical calculations. But a three-qubit system is the minimum unit for scaling up and implementing error correction", explains Seigo Tarucha, who led the research.

False-colored scanning electron micrograph of the device. The purple and green structures represent the aluminum gates, per scitechdaily.com.

After successfully entangling two qubits, the team of researchers introduced the third qubit and was able to predict its state with a high fidelity of 88%. Tarucha added, "We plan to demonstrate primitive error correction using the three-qubit device and to fabricate devices with ten or more qubits. We then plan to develop 50 to 100 qubits and implement more sophisticated error-correction protocols, paving the way to a large-scale quantum computer within a decade."

For more information on this story, check out this link here.

Read the original post:
Quantum computing breakthrough achieved, road to the future begins now - TweakTown

Large-Scale Simulations Of The Brain May Need To Wait For Quantum Computers – Forbes

Will quantum computer simulations crack open our understanding of the biological brain?

Looking back at the history of computers, it's hard to overstate the rate at which computing power has scaled in the course of just a single human lifetime. And yet, existing classical computers have fundamental limits. If quantum computers are successfully built and eventually fully come online, they will be able to tackle certain classes of problems that elude classical computers. And they may be the computational tool needed to fully understand and simulate the brain.

As of this writing, the fastest supercomputer in the world is Japan's Fugaku supercomputer, developed jointly by RIKEN and Fujitsu. It can perform 442 petaFLOPS, or 442 quadrillion floating-point operations per second.

Let's break that number down in order to arrive at an intuitive (as much as possible) grasp of what it means.

A floating-point number is a way to express, or write down, a real number (real in the mathematical sense) with a fixed amount of precision. Real numbers are all the continuous numbers on the number line: 5, -23, 7/8, and numbers like pi (3.1415926...) that go on forever are all real numbers. The problem is that a computer, which is digital, has a hard time internally representing continuous numbers. One way around this is to specify a limited number of significant digits, and then specify how big or small the actual number is by some power of a base. For example, the number 234 can be written as 2.34 × 10^2, because 2.34 × 100 equals 234. Floating-point numbers specify a fixed number of significant digits the computer must store in its memory, which fixes the accuracy of the number. This matters because any mathematical operation (e.g. addition, subtraction, division or multiplication) on these fixed-accuracy versions of real numbers generates small errors that propagate (and can grow) through subsequent calculations. But as long as the errors remain small, it's okay.
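To make the precision point concrete, here is a minimal Python sketch (not from the article) showing that ordinary 64-bit floats cannot store some decimals exactly, and that the tiny representation errors accumulate through repeated operations:

```python
# Standard Python floats are 64-bit IEEE 754 values with roughly 15-16
# significant decimal digits of precision.
x = 0.1 + 0.2
print(x)          # 0.30000000000000004 -- neither 0.1 nor 0.2 is stored exactly

# Small representation errors propagate through repeated arithmetic:
total = 0.0
for _ in range(1_000_000):
    total += 0.1
print(total)      # slightly off from 100000.0 because of accumulated rounding error

# Scientific notation mirrors the "digits times a power" idea in the text:
print(f"{234:.2e}")   # 2.34e+02, i.e. 2.34 x 10^2
```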

A floating-point operation (abbreviated FLOP), then, is any arithmetic operation between two floating-point numbers. Computer scientists and engineers use the number of FLOP per second, or FLOPS, as a benchmark to compare the speed and computing power of different computers.

One petaFLOP is equivalent to 1,000,000,000,000,000, or one quadrillion, floating-point operations. A supercomputer with a computing speed of one petaFLOPS is therefore performing one quadrillion operations per second! The Fugaku supercomputer is 442 times faster than that.

For many types of important scientific and technological problems, however, even the fastest supercomputer isn't fast enough. In fact, it never will be. This is because for certain classes of problems, the number of possible solutions that need to be checked grows so fast, relative to the number of things being arranged, that it becomes essentially impossible to compute and check them all.

Here's a version of a classic example. Say you have a group of people with differing political views, and you want to seat them around a table in order to maximize constructive dialogue while minimizing potential conflict. The rules you decide to use don't matter here, just that some set of rules exists. For example, maybe you always want to seat a moderate between a conservative and a liberal to act as a bit of a buffer.

This is what scientists and engineers call an optimization problem. How many possible seating arrangements are there? Well, if you only have two people, there are only two possible arrangements: one individual on each side of the table, and then the reverse, where the two individuals swap seats. But if you have five people, the number of possible arrangements jumps to 120. Ten people? Now you're looking at 3,628,800 different arrangements. And that's just for ten people, or more generally, any ten objects. If you had 100 objects, the number of arrangements is so huge that it's a number with 158 digits (roughly 9 × 10^157). By comparison, there are only about 10^21 stars in the observable universe.
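Those counts are just factorials: n objects can be ordered in n! ways. A short, hedged Python sketch (the one-arrangement-per-operation assumption is purely illustrative) shows how quickly this outruns even Fugaku:

```python
import math

# Number of ways to order n people (or any n objects): n! arrangements.
for n in (2, 5, 10, 100):
    print(n, math.factorial(n))
# 2 -> 2, 5 -> 120, 10 -> 3,628,800, 100 -> a 158-digit number

# Even at Fugaku's ~442 petaFLOPS (4.42e17 operations per second), checking
# one arrangement per operation would take on the order of 10^140 seconds:
seconds = math.factorial(100) / 4.42e17
print(f"{seconds:.2e} seconds")
```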

Imagine now that you were trying to do a biophysics simulation of a protein, in order to develop a new drug, with millions or billions of individual molecules interacting with each other. The number of possible combinations that would need to be computed and checked far exceeds the capability of any computer that exists today. Because of how they're designed, even the fastest supercomputer is forced to check each combination sequentially, one after another. No matter how fast a classical computer is or can be, given the literally more-than-astronomical number of combinations, many of these problems would take a practical eternity to solve. It just becomes impossible.

Relatedly, the other problem classical computers face is that it's impossible to build one with sufficient memory to store each of the combinations, even if they could all be computed.

The details of how a quantum computer and quantum computing algorithms work are well beyond the scope or intent of this article, but we can briefly introduce one of the key ideas in order to understand how they can overcome the combinatorial limitations of classical computers.

Classical computers represent information, all information, as numbers. And all numbers can be represented as binary combinations of 1s and 0s. The 1 and 0 each represent a bit of information, the fundamental unit of classical information. Put another way, information is represented by combinations of two possible states. For example, the number 24 in binary notation is 11000. The number 13 is 1101. You can do all arithmetic in binary as well. This is convenient because, physically, at the very heart of classical computers is the transistor, which is just an on-off electrical switch. When it's on it encodes a 1, and when it's off it encodes a 0. Computers do all their math by combining billions of tiny transistors that very quickly switch back and forth as needed. Yet, as fast as this can occur, it still takes finite amounts of time, and all calculations need to be done in an appropriately ordered sequence. If the number of necessary calculations becomes big enough, as is the case with the combinatorial problems discussed above, you run into an unfeasible computational wall.
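A quick Python check of the binary examples above:

```python
# Binary representations of the numbers mentioned in the text.
print(bin(24))        # 0b11000
print(bin(13))        # 0b1101

# Arithmetic works the same way in binary; only the display differs.
print(24 + 13)             # 37
print(bin(24 + 13))        # 0b100101
print(int("11000", 2) + int("1101", 2))   # 37 again, parsing the binary strings
```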

Quantum computers are fundamentally different. They overcome the classical limitations by being able to represent information internally not just as one of two discrete states, but as a continuous, probabilistic mixing of states. This allows quantum bits, or qubits, to represent many more possible states at once, and therefore many more possible arrangements of objects at once. Put another way, the state space and computational space a quantum computer has access to is much larger than that of a classical computer. And because of the wave nature of quantum mechanics and superposition (concepts we will not explore here), the internal mixing and probabilistic representation of states eventually converges to one dominant solution that the computer outputs. You can't actually observe that internal mixing, but you can observe the final computed output. In essence, as the number of qubits in the quantum computer increases, you can do exponentially more calculations in parallel.
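One way to get a feel for that exponential state space is to note that describing n qubits classically takes 2^n complex amplitudes. A minimal NumPy sketch (an illustration of the bookkeeping, not a real quantum computer):

```python
import numpy as np

# The state of n qubits is a normalised vector of 2**n complex amplitudes;
# a given basis state is measured with probability |amplitude|**2.
n = 3
state = np.ones(2**n, dtype=complex) / np.sqrt(2**n)   # equal superposition of all 8 states
print(len(state))                 # 8 amplitudes for 3 qubits
print(np.abs(state)**2)           # each of the 8 outcomes has probability 1/8

# Each extra qubit doubles the number of amplitudes a classical simulation must track:
for n in (10, 20, 30, 50):
    print(n, "qubits ->", 2**n, "amplitudes")
```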

The key concept here is not that quantum computers will necessarily be able to solve new and exotic classes of problems that classical computers can't (although computer scientists have identified theoretical classes of problems that only quantum computers can solve), but rather that they will be able to solve classes of problems that are, and always will be, beyond the practical reach of classical computers.

And this isn't to say that quantum computers will replace classical computers. That is not likely to happen anytime in the foreseeable future. For most classes of computational problems, classical computers will still work just fine and will probably continue to be the tool of choice. But for certain classes of problems, quantum computers will far exceed anything possible today.

Well, it depends on the scale at which the dynamics of the brain are being simulated. For sure, there has been much work within the field of computational neuroscience over many decades successfully carrying out computer simulations of the brain and brain activity. But it's important to understand the scale at which any given simulation is done.

The brain is exceedingly hierarchical, both structurally and functionally: from genes, to molecules, to cells, to networks of cells and networks of brain regions. Any simulation of the brain needs to begin with an appropriate mathematical model, a set of equations that captures the chosen scale being modeled and thereby specifies a set of rules to simulate on a computer. It's like a map of a city. The mapmaker needs to decide on the scale of the map: how much detail to include and how much to ignore. Why? Because the structural and computational complexity of the brain is so vast that it's impossible, given existing classical computers, to carry out simulations that cut across the many scales with any significant amount of detail.

Even though a wide range of mathematical models of the molecular and cell biology and physiology exists across this huge structural and computational landscape, the brain is impossible to simulate with any accuracy because of the sheer size of the combinatorial space this landscape presents. It is the same class of problem as seating people with different political views around a table, but on a much larger scale.

Once again, it in part depends on how you choose to look at it. There is an exquisite amount of detail and structure to the brain across many scales of organization. Here's a more in-depth article on this topic.

But if you just consider the number of cells that make up the brain and the number of connections between them as a proxy for the computational complexity - the combinatorial space - of the brain, then it is staggeringly large. In fact, it defies any intuitive grasp.

The brain is a massive network of densely interconnected cells, consisting of about 171 billion brain cells: 86 billion neurons, the main class of brain cell involved in information processing, and another 85 billion non-neuronal cells. There are approximately 10 quadrillion connections between neurons; that is a 1 followed by 16 zeros. And of the 85 billion non-neuronal cells in the brain, one major type, astrocyte glial cells, can both listen in on and modulate neuronal signaling and information processing. Astrocytes form a massive network unto themselves, while also cross-talking with the network of neurons. So the brain actually has two distinct networks of cells, each carrying out different physiological and communication functions, but at the same time overlapping and interacting with each other.

The computational size of the human brain in numbers.

On top of all that structure, there are billions upon billions of discrete electrical impulses, called action potentials, that act as messages between connected neurons. Astrocytes, unlike neurons, don't use electrical signals. They rely on a different form of biochemical signaling to communicate with each other and with neurons. So there is an entire other, molecularly based information-signaling mechanism at play in the brain.

Somehow, in ways neuroscientists still do not fully understand, the interactions of all these electrical and chemical signals carry out all the computations that produce everything the brain is capable of.

Now pause for a moment, and think about the uncountable number of dynamic and ever changing combinations that the state of the brain can take on given this incredible complexity. Yet, it is this combinatorial space, the computations produced by trillions of signals and billions of cells in a hierarchy of networks, that result in everything your brain is capable of doing, learning, experiencing, and perceiving.

So any computer simulation of the brain is ultimately going to be very limited. At least on a classical computer.

How big and complete are the biggest simulations of the brain done to date? And how much impact have they had on scientists' understanding of the brain? The answer critically depends on what's being simulated; in other words, at what scale (or scales) and with how much detail, given the myriad of combinatorial processes. There certainly continue to be impressive attempts from various research groups around the world, but the number of cells being simulated, the level of detail, and the amount of time being simulated remain rather limited. This is why headlines and claims that tout ground-breaking large-scale simulations of the brain can be misleading, sometimes resulting in controversy and backlash.

The challenges of doing large multi-scale simulations of the brain are significant. So in the end, the answer to how big and complete the biggest simulations of the brain done to date are, and how much impact they have had on scientists' understanding of the brain, is: not much.

First, by their very nature, and given a sufficient number of qubits, quantum computers will excel at solving and optimizing very large combinatorial problems. It's an inherent consequence of the physics of quantum mechanics and the design of the computers.

Second, given the sheer size and computational complexity of the human brain, any attempt at a large multi-scale simulation with sufficient detail will have to contend with the combinatorial space of the problem.

Third, how a potential quantum computer neural simulation is set up might be able to take advantage of the physics the brain is subject to. Despite its computational power, the brain is still a physical object, and so physical constraints could be used to design and guide simulation rules (quantum computing algorithms) that are inherently combinatorial and parallelizable, thereby taking advantage of what quantum computers do best.

For example, local rules, such as the computational rules of individual neurons, can be used to calculate aspects of the emergent dynamics of networks of neurons in a decentralized way. Each neuron does its own thing and contributes to the larger whole, in this case the functions of the whole brain itself, all acting at the same time, and without any awareness of what it is contributing to. A toy sketch of this idea follows below.
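Here is a toy, hedged Python sketch (illustrative only, not a brain simulation and not a quantum algorithm) of how purely local update rules can produce a global, emergent quantity that no single unit computes on its own:

```python
import numpy as np

# Each "neuron" updates only from local information: the weighted activity of
# the neurons connected to it. The network-wide activity pattern is emergent.
rng = np.random.default_rng(0)
n = 100
# Sparse random connectivity: roughly 10% of possible connections, random weights.
weights = rng.normal(0, 1, size=(n, n)) * (rng.random((n, n)) < 0.1)
state = (rng.random(n) < 0.2).astype(float)           # initial firing pattern

for step in range(10):
    drive = weights @ state                           # local rule: sum input from neighbours
    state = (drive > 0.5).astype(float)               # fire if local drive crosses a threshold
    print(step, int(state.sum()), "neurons active")   # a global, emergent quantity
```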

In the end, the goal will be to understand the emergent functions of the brain that give rise to cognitive properties. For example, large scale quantum computer simulations might discover latent (hidden) properties and states that are only observable at the whole brain scale, but not computable without a sufficient level of detail and simulation from the scales below it.

If these simulations and research are successful, one can only speculate about what as-yet-unknown brain algorithms remain to be discovered and understood. It's possible that such future discoveries will have a significant impact on related topics such as artificial quantum neural networks, or on specially designed hardware that may some day challenge the boundaries of existing computational systems. For example, just published yesterday, an international team of scientists and engineers announced a computational hardware device composed of a molecular-chemical network capable of energy-efficient, rapidly reconfigurable states, somewhat similar to the reconfigurable nature of biological neurons.

One final comment regarding quantum computers and the brain: This discussion has focused on the potential use of future quantum computers to carry out simulations of the brain that are not currently possible. While some authors and researchers have proposed that neurons themselves might be tiny quantum computers, that is completely different and unrelated to the material here.

It may be that quantum computers will usher in a new era for neuroscience and the understanding of the brain. It may even be the only real way forward. But as of now, actually building workable quantum computers with enough stable qubits to outperform classical computers at even modest tasks remains a work in progress. While a handful of commercial efforts exist and have claimed various degrees of success, many difficult hardware and technological challenges remain. Some experts argue that quantum computers may in the end never be built, for technical reasons. But there is much research across the world, both in academic labs and in industry, attempting to overcome these engineering challenges. Neuroscientists will just have to be patient a bit longer.

Continue reading here:
Large-Scale Simulations Of The Brain May Need To Wait For Quantum Computers - Forbes

Quantum More Than Just Computing – Todayuknews

Dr Najwa Sidqi, Knowledge Transfer Manager of Quantum Technologies at KTN, explains that, despite the media focus on computing, quantum technologies are far broader than you might think, and they are set to impact the world dramatically

Throughout history, there have been revolutionary technological innovations that have changed the way the world operates, and quantum technology is set to be the next of these developments. While quantum computing is regularly discussed in the media, it is largely hogging the limelight. That's right: the scope of quantum tech is far broader than just increasing computing power beyond anything that is currently available. With some of it very close to market, it's quite strange that we don't hear about all the other elements of quantum technology that are soon going to change our lives.

In recent years, the advancement of technology has been seen through our ability to shrink things down and get more processing power out of a smaller surface area. The problem is, there is a limit to how small we can go while we use electrons as our basic building block of computing (literally the difference between a 1 and a 0 to a computer). If, however, we were able to utilise smaller subatomic particles, such as photons, we could increase the power of our technology considerably.

But as we've learnt to manipulate and measure the energy of individual photons, we've come to realise that the applications go beyond simply boosting the processing power of our PCs. And that's why quantum technology is broader than quantum computing.

So, why does computing take up so much of the focus? It's simple really: the benefits of quantum computing are easy to get your head around and apply to just about every sector. All industries, from finance to construction and nuclear energy to farming, require at least some level of computing.

The other key reason is that it's the big names in IT (Google, IBM and Microsoft) that are driving the development of quantum computing, each devoting huge amounts of resource to it and generating a lot of media interest too.

So, what are some other applications of quantum technology? Well, that's the exciting thing. The applications are enormous and could well be endless.

Right now, there's exciting work being done in quantum communication, which allows for far more robust data encryption than what is currently available.

Quantum sensing is another incredible field of research and development, one that will take our ability to precisely measure electromagnetic waves, fields and forces so much further forward that it's hard to comprehend the impact on scientific understanding.

Quantum imaging has the potential to revolutionise metrology in a number of fields, with applications ranging from gas leak detection to non-invasive in vivo imaging in healthcare. So, how far off into the distant future are these technologies of tomorrow? Well, not too distant at all; in fact, they're already being commercialised.

Companies such as QLM Technology use a quantum gas imaging LIDAR to detect and monitor greenhouse gases. The photon-precise sensor allows organisations to effectively monitor and map the locations and flow rates of gas leaks with high-sensitivity imaging that shows plume shape and concentration.

Likewise, ID Quantique, based in Switzerland, is already leading the world in quantum-safe encryption solutions. Their products are in use by governments, enterprises and research labs across the world.

OK, yes, quantum computing is very exciting, but it's not the only quantum technology that's going to improve our lives. There are exciting developments occurring throughout the field of quantum technology which deserve the same amount of attention, and they're right around the corner!

If you're interested in quantum R&D, the UK National Quantum Technologies Showcase is taking place on Friday 5th November in the Business Design Centre, London. It will bring together around 60 of the UK's most exciting projects from across the quantum landscape. The event will also be streamed live for virtual attendees. Exhibitors can register now, and delegates will be able to register in September. I'd love to see you there.

See original here:
Quantum More Than Just Computing - Todayuknews

Memory devices on satellites to enable the quantum internet – University of Strathclyde

The installation of memory and repeater devices in space, to enable use of the quantum internet, has been proposed in research by the University of Strathclyde and an international collaboration.

The study suggests that quantum memories (QM), which store information in quantum form, and repeaters, which are used in the transmission of the information, can be deployed to facilitate use of advanced internet technology. This is done through distribution of quantum entanglement, a phenomenon in which two particles are interlinked, potentially at vast distances from each other.

The research showed that satellites equipped with QMs provided entanglement distribution rates which were three orders of magnitude faster than those from fibre-based repeaters or space systems without QMs.

The study has been published in the journal npj Quantum Information. It was led by Humboldt University in Berlin and also involved the Institute of Optical Sensor Systems of the German Aerospace Center (DLR) and NASA's Jet Propulsion Laboratory (JPL).

Dr Daniel Oi, Senior Lecturer in Strathclyde's Department of Physics and a partner in the research, said: "We show in this paper that this method would have much higher performance than previously proposed schemes and we identify promising physical systems with which to implement it."

The work is connected to wider work at Strathclyde on Quantum Technologies, and in particular Space Quantum Communication research that includes several space missions due to be launched in the next few years.

Global-scale quantum communication links will form the backbone of the quantum internet. Exponential loss in optical fibres means that there is no realistic way to extend this beyond a few hundred kilometres, but quantum repeaters and space-based systems offer a solution to this limitation.
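To put rough numbers on that exponential loss, here is a hedged back-of-the-envelope Python sketch, assuming a typical telecom-fibre attenuation of about 0.2 dB per kilometre (a standard textbook figure, not taken from the study):

```python
# Fraction of photons surviving a direct fibre link of a given length,
# assuming ~0.2 dB/km attenuation (typical for telecom fibre at 1550 nm).
loss_db_per_km = 0.2

for distance_km in (100, 500, 1000, 5000):
    loss_db = loss_db_per_km * distance_km
    survival = 10 ** (-loss_db / 10)
    print(f"{distance_km:>5} km: {survival:.1e} of photons arrive")

# At 1,000 km only about 1 photon in 10^20 gets through, which is why
# repeaters or satellite links are needed for global-scale quantum links.
```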

The proposal in the research uses satellites equipped with QMs in low-Earth orbit. It is focused on the use of quantum key distribution (QKD) for encryption and distribution, and of QMs to synchronise detection events that would otherwise have to coincide by chance.

The researchers describe their study as "a roadmap to realise unconditionally secure quantum communications over global distances with near-term technologies".

The paper states: "With the majority of optical links now in space, a major strength of our scheme is its increased robustness against atmospheric losses. We further demonstrate that QMs can enhance secret key rates in general line-of-sight QKD protocols."

A Quantum Technology Cluster is embedded in the Glasgow City Innovation District, an initiative driven by Strathclyde along with Glasgow City Council, Scottish Enterprise, Entrepreneurial Scotland and Glasgow Chamber of Commerce. It is envisaged as a global place for quantum industrialisation, attracting companies to co-locate, accelerate growth, improve productivity and access world-class research technology and talent at Strathclyde.

The University of Strathclyde is the only academic institution that has been a partner in all four EPSRC-funded Quantum Technology Hubs in both phases of funding. The Hubs are in: Sensing and Timing; Quantum Enhanced Imaging; Quantum Computing and Simulation; and Quantum Communications Technologies.

Go here to see the original:
Memory devices on satellites to enable the quantum internet - University of Strathclyde

Future in the cloud for encryption – Capacity Media

06 September 2021 | Alan Burkitt-Gray

Traditional PKI methods of encrypting data are about to fall to the onslaught of quantum computing. Arqit, a start-up led by David Williams, thinks it has a quantum-based solution, he tells Alan Burkitt-Gray

A start-up company that is expected to be valued at US$1.4 billion by the end of August is launching its quantum-based telecoms encryption service in the middle of July. Arqit, founded by satellite entrepreneur David Williams, is launching QuantumCloud, a platform-as-a-service (PaaS) for telecoms, including consumer, industrial and defence internet of things (IoT), he tells me.

Early customers, including BT and other telcos that he doesn't want to name, have already signed contracts and used the cyber-security software, but Arqit is likely to be thrust into greater prominence imminently, when a Nasdaq-listed special purpose acquisition company (Spac) buys it in a deal that will value it at $1.4 billion.

Williams and a small number of co-founders will own 45%, he tells me, a stake that will be worth $630 million to him and his colleagues.

A former banker, Williams, who is now chairman of Arqit, was founder and CEO of Avanti, a UK-based company that runs a fleet of geostationary satellites called Hylas with government, military and commercial customers. He left Avanti in August 2017 and a month later set up Arqit.

Being the founder of two satellite companies is a pretty remarkable record after seven years working for three banks following a degree in economics and politics. (He also notes that he was the yard-of-ale champion at the University of Leeds.)

However, his first start-up, Avanti Communications, has not fared well over the past year, long after Williams's departure. In February 2021 its existing junior lenders injected $30 million of new capital, and its so-called super senior facility, which was due for repayment in February, was extended, but only to the end of January 2022.

Existential threat

But Arqit has moved into a completely different market, addressing something the company calls an existential threat to the hyperconnected world. Why? "The legacy encryption that we all use, designed in the 1980s, has done a great job but is now failing us," says Arqit on its website. "It was never intended for use in our hyper-connected world. The breaches caused are seen around us daily."

At the same time, there is a bigger problem. "Quantum computing now poses an existential threat to cyber security for everyone. As a result, the world must begin a global upgrade cycle to replace all encryption technologies, an upgrade unlike anything we have seen before," says the company.

Don't bother patching and mending, says Arqit. Don't take risks with incremental improvements to public key encryption, which is no longer fit for purpose.

Encryption using public key infrastructure (PKI) emerged from the communications intelligence community around 1971, in work by James Ellis at the UK's Government Communications Headquarters (GCHQ), and was then developed further in 1976 through work in the US and Israel by Whit Diffie and Martin Hellman, and separately by Ronald Rivest, Adi Shamir and Leonard Adleman (known, from their initials, as RSA).

So, the idea is virtually half a century old. But in that time, certainly in the past decade, it has served us well. If the URL of a website starts with https://, you know it's encrypted to those 1970s standards. It means we are reasonably confident we can type our credit card details into a hotel, theatre, travel or shopping site. Messaging apps such as Signal and WhatsApp use encryption based on these PKI principles.

No one trusts PKI

However, no one trusts PKI any more, says Williams. "The safest way of delivering keys to a battlefield is now to put them on a dongle and fly them in by helicopter."

At the heart of the problem is the fact that quantum computers are coming, and quantum computers are fast. Diffie and Hellman, and the RSA trio, calculated that if it took weeks or months to decrypt a message, PKI was secure. Breaking the code would be "computationally infeasible", to use the term the crypto community likes.
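A toy illustration (not real RSA, and vastly smaller than real key sizes) of the asymmetry that public-key cryptography leans on: multiplying two primes is instant, while recovering them by brute-force search gets rapidly slower as the numbers grow. Quantum algorithms such as Shor's are what threaten to collapse that asymmetry.

```python
import math
import time

def smallest_factor(n):
    """Recover a prime factor of n by trial division (brute force)."""
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d
    return n

# 104729 and 1299709 are the 10,000th and 100,000th primes.
p, q = 104729, 1299709
n = p * q                              # the easy direction: instant

start = time.perf_counter()
print(smallest_factor(n))              # the hard direction: prints 104729
print(f"{time.perf_counter() - start:.4f} s by trial division")
# Real RSA moduli have hundreds of digits; trial division (and every known
# classical method) becomes "computationally infeasible" at that size.
```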

By perhaps as soon as next year, quantum computers will be able to work so fast that they will have decrypted the text in a usable period of time. The challenge will no longer be computationally infeasible. Someone intercepting a transaction could find your credit card details within an hour or so, and use them. So, thats why there is pressure to upgrade to a new system of key exchange, a replacement for PKI.

However, the security people have something more to worry about. Many suspect that for years governments and other organisations have been squirrelling away in their vaults traffic that is encrypted to current standards, knowing that, any time soon, they will be able to crack it.

Think of all those politicians, on all sides of the global political divides, who have been conspiring via WhatsApp. Think of all those whistleblowers who have leaked information to law enforcement authorities or journalists via Signal. Think of all those criminal organisations that have been using Telegram for their plans.

Lemon juice and milk

That's why PKI, the current crypto infrastructure, is facing what Arqit calls an existential threat. Pretty soon, it will be as outmoded as writing "Xf buubdl bu ebxo upnpsspx"* in lemon juice or milk and sending it via carrier pigeon. Don't bother with minor fixes, says Arqit. It's wrong to patch and mend, or to take risks.

The future lies in symmetric keys, with a new way of distributing them. "Symmetric keys are provably secure against any attack, including quantum computing," says the company.

The problem is that, until now, there has been no safe way to distribute them. Arqit says that it offers a method to create those keys at scale, securely, at any kind of endpoint device. "We have invented a method of creating unbreakable encryption keys locally, both at the edge and in the cloud," says Williams.
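For readers unfamiliar with the distinction, here is a generic, hedged sketch of symmetric encryption, using the third-party Python 'cryptography' package. This is not Arqit's product or method; it simply shows that once two parties hold the same key, encryption and decryption are straightforward, and that the genuinely hard problem (the one Arqit claims to solve) is getting that shared key to every endpoint safely.

```python
# Generic symmetric encryption with a shared key (pip install cryptography).
# Illustrative only -- not Arqit's QuantumCloud.
from cryptography.fernet import Fernet

shared_key = Fernet.generate_key()      # in practice this key must reach both ends securely
cipher = Fernet(shared_key)

token = cipher.encrypt(b"We attack at dawn tomorrow")
print(token)                            # unintelligible without the key
print(cipher.decrypt(token))            # b'We attack at dawn tomorrow'
```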

Arqit has a solution. It's called Arq19, pretty much for the same reason Covid-19 has that suffix: "2019 was our Eureka moment," he smiles.

These are systems he calls global and trustless, a confusing term. It seems to mean you can't trust it, but what Williams and Arqit mean is that you don't have to trust it, as keys will never be stored in any system, so they cannot be stolen, but they can be put on devices within less than half a second to enable a high level of security.

"We create hardware storage modules in a number of places," he says: London, New York and Sydney, for example. But those aren't the keys. They are clues, a process involving shared secrets to create brand-new symmetric encryption keys. No, I don't understand either; but how many people in 1936 understood Turing's famous paper, On Computable Numbers, which started the computer revolution? (Turing went on to work during World War Two at GCHQ's predecessor at Bletchley Park, in what is now the English city of Milton Keynes.)

Arqit can deliver its keys in unlimited group sizes, says Williams. The traditional PKI approach is for two-way communications: Alice and Bob, in the crypto community's terminology.

But what Williams is looking for is a system that will work with Alice, Bob, Catherine, Dave, Eve and a whole telephone directory.

For example, says Williams, they can deliver keys to international telecoms networks, "and we can change the key every second if we want". He says that will result in ultra-secure software-defined networks (SDNs).

"We can deliver quantum keys in a manner that's global and trustless," says Williams. The company will use a small fleet of satellites, weighing 300kg each, that is being built by QinetiQ, a company formed 20 years ago by the privatisation of part of the UK government's Defence Evaluation and Research Agency.

BT has an exclusive deal to distribute Arqit's QuantumCloud services in the UK, and the Japanese firm Sumitomo has a deal as the first big international customer, says Williams.

It is working with telcos to encrypt traffic on Japanese fibre cables, he adds.

These are contracts with distributors that have been signed, but the company's first contract with a corporate user went live in June, he says, although he will not name the partner, except to say that it is a big global corporation. It is an enterprise customer and is not BT.

The eventual market will include the internet of things (IoT) and connected cars, enterprise and connectivity, he said. The cost will be low, says Williams: users will pay a tiny fraction of a dollar for each key created.

Heir to Turing

Williams has gathered around him a range of technical, crypto and management talent. CTO and co-founder with Williams is David Bestwick, who was also a co-founder and CTO of Avanti. There's a chief cryptographer who was at GCHQ: think of David Shiu as the inheritor of the tradition founded by Turing 80 years ago.

There are other ex-GCHQ people, too, and a retired air vice-marshal and a former lieutenant general in the US Air Force. And more, including experts in telecoms, IT and a chief software engineer who was at McAfee. And a former head of operations at 10 Downing Street.

These people are well connected. Well see what they achieve.

Though, will we be able to find out, or will it all be encrypted?

*"Xf buubdl bu ebxo upnpsspx" means just "We attack at dawn tomorrow", using the so-called Caesar cipher, as reputedly used by the Roman dictator.
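For the curious, the footnote's cipher is a shift-by-one Caesar cipher; a quick Python sketch reproduces it:

```python
# Caesar cipher: shift each letter by a fixed amount, wrapping around the alphabet.
def caesar(text, shift):
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)
    return "".join(out)

print(caesar("We attack at dawn tomorrow", 1))    # Xf buubdl bu ebxo upnpsspx
print(caesar("Xf buubdl bu ebxo upnpsspx", -1))   # We attack at dawn tomorrow
```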

Read the original:
Future in the cloud for encryption - Capacity Media