Archive for the ‘Quantum Computing’ Category

Inside the weird, wild, and wondrous world of quantum video games – Digital Trends


In 1950, a man named John Bennett, an Australian employee of the now-defunct British technology firm Ferranti, created what may be history's first gaming computer. It could play a game called Nim, a long-forgotten parlor game in which players take turns removing matches from several piles; the player who removes the last match loses. For his computerized version, Bennett created a vast machine 12 feet wide, 5 feet tall, and 9 feet deep. The majority of this space was taken up by light-up vacuum tubes that depicted the virtual matches.
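The rules Bennett's machine embodied fit in a few lines. Here is a minimal modern sketch of that Nim variant in Python (an illustration of the rules described above, not Ferranti's original logic; the pile sizes are arbitrary):

```python
# Misère Nim, as described above: players alternate removing matches from
# piles, and whoever takes the last match loses.

def apply_move(piles, pile_index, count):
    """Remove `count` matches from pile `pile_index`, returning the new piles."""
    if count < 1 or count > piles[pile_index]:
        raise ValueError("illegal move")
    new_piles = list(piles)
    new_piles[pile_index] -= count
    return new_piles

def game_over(piles):
    """The game ends when no matches remain; the player who just moved,
    having taken the last match, is the loser."""
    return all(p == 0 for p in piles)

# A short illustrative game with three piles.
piles = [3, 4, 5]
piles = apply_move(piles, 2, 5)   # take all of pile 2
piles = apply_move(piles, 1, 4)   # take all of pile 1
piles = apply_move(piles, 0, 3)   # take the last matches: this player loses
print(game_over(piles))  # True
```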

Bennett's aim wasn't to create a game-playing machine for its own sake, the reason somebody might build a gaming PC today. As writer Tristan Donovan observed in Replay, his superlative 2010 history of video games: "Despite suggesting Ferranti create a game-playing computer, Bennett's aim was not to entertain but to show off the ability of computers to do [math]."

Jump forward almost 70 years, and a physicist and computer scientist named Dr. James Robin Wootton is using games to demonstrate the capabilities of another new, and equally large, experimental computer. The computer in question is a quantum computer, a dream of scientists since the 1980s, now finally becoming a scientific reality.

Quantum computers encode information as delicate correlations with an incredibly rich structure. This allows for potentially mind-boggling densities of information to be stored and manipulated. Unlike a classical computer, which encodes information as a series of ones and zeroes, the bits (called qubits) in a quantum computer can be a one, a zero, or both at the same time. These qubits are composed of subatomic particles, which conform to the rules of quantum rather than classical mechanics. They play by their own rules, a little bit like Tom Cruise's character Maverick from Top Gun, if he spent less time buzzing the tower and more time demonstrating properties like superposition and entanglement.
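The "both at the same time" idea can be made concrete with a tiny calculation. In the standard textbook picture, a single qubit is a two-entry complex vector, and measurement probabilities come from squaring the amplitudes. The sketch below simulates that on a classical machine (an illustration of the math, not IBM's hardware or API):

```python
import math

# A single qubit as a two-entry complex state vector. [1, 0] is the
# definite-zero state |0>; a Hadamard gate turns it into an equal
# superposition of zero and one.
ket0 = [1 + 0j, 0 + 0j]

def hadamard(state):
    """Apply a Hadamard gate to a one-qubit state vector."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

superposed = hadamard(ket0)                            # now "both at once"
probabilities = [abs(amp) ** 2 for amp in superposed]  # Born rule: |amplitude|^2
print(probabilities)  # approximately [0.5, 0.5]: a 50/50 coin flip on measurement
```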

I met Wootton at IBM's research lab in Zurich on a rainy day in late November. Moments prior, I had squeezed into a small room with a gaggle of other excited onlookers, where we stood behind a rope and stared at one of IBM's quantum computers like people waiting to be allowed into an exclusive nightclub. I was reminded of the way that people, in John Bennett's day, talked about the technological priesthood surrounding computers: then enormous mainframes sequestered away in labyrinthine chambers, tended to by highly qualified people in white lab coats. Lacking the necessary seminary training, we quantum computer visitors could only bask in its ambience from a distance, listening in reverent silence to the weird vee-oing vee-oing vee-oing sound of its cooling system.

Wootton's interest in quantum gaming came about from exactly this scenario. In 2016, he attended a quantum computing event at the same Swiss ski resort where, in 1925, Erwin Schrödinger had worked out his famous wave equation while on vacation with a girlfriend. If there is a ground zero for quantum computing, this was it. Wootton was part of a consortium, sponsored by the Swiss government, to do (and help spread the word about) quantum computing.

"At that time, quantum computing seemed like it was something that was very far away," he told Digital Trends. "Companies and universities were working on it, but it was a topic of research, rather than something that anyone on the street was likely to get their hands on. We were talking about how to address this."

Wootton has been a gamer since the early 1990s. "I won a Game Boy in a competition in a wrestling magazine," he said. "It was a Slush Puppy competition where you had to come up with a new flavor. My Slush Puppy flavor was called something like Rollin' Redcurrant. I'm not sure if you had to use the adjective. Maybe that's what set me apart."

While perhaps not a straight path, Wootton knew how an interest in gaming could lead people to an interest in other aspects of technology. He suggested that making games using quantum computing might be a good way of raising public awareness of the technology. He applied for support and, for the next year, was given, to his amazement, the chance to go and build an educational computer game about quantum computing. "At the time, a few people warned me that this was not going to be good for my career," he said. "[They told me] I should be writing papers and getting grants; not making games."

But the idea was too tantalizing to pass up.

That same year, IBM launched its Quantum Experience, an online platform granting the general public (at least those with a background in linear algebra) access to IBM's prototype quantum processors via the cloud. Combined with ProjectQ, a quantum SDK capable of running jobs on IBM's devices, this took care of both the hardware and software components of Wootton's project. What he needed now was a game. Wootton's first attempt at creating a quantum game for the public was a version of Rock-Paper-Scissors, named Cat-Box-Scissors after the famous Schrödinger's cat thought experiment. Wootton later dismissed it as "[not] very good ... little more than a random number generator with a story."

But others followed. There was Battleships, his crack at the first multiplayer game made with a quantum computer. There was Quantum Solitaire. There was a text-based dungeon crawler, modeled on 1973's Hunt the Wumpus, called Hunt the Quantpus. Then came the messily titled, but significant, Battleships with partial NOT gates, which Wootton considers the first true quantum computer game, rather than just an experiment. And so on. As games, these don't exactly make Red Dead Redemption 2 look like yesterday's news. They're more like Atari 2600 or Commodore 64 games in their aesthetics and gameplay. Still, that's exactly what you'd expect from the embryonic phases of a new computing architecture.

If you'd like to try out a quantum game for yourself, you're best off starting with Hello Quantum, available for both iOS and Android. It reimagines the principles of quantum computing as a puzzle game in which players must flip qubits. It won't make you a quantum expert overnight, but it will help demystify the process a bit. (With every level, players can hit a "learn more" button for a digestible tutorial on quantum basics.)

Quantum gaming isn't just about educational outreach, though. Just as John Bennett imagined Nim as a game that would exist to show off a computer's abilities, only to unwittingly kickstart a $130-billion-a-year industry, so quantum games are moving beyond just teaching players lessons about quantum computing. Increasingly, Wootton is excited about what he sees as real-world uses for quantum computing. One of the most promising of these is taking advantage of quantum computing's random number generation to create random terrain within computer games. In Zurich, he showed me a three-dimensional virtual landscape reminiscent of Minecraft. However, while much of the world of Minecraft is user generated, in this case the blocky, low-resolution world was generated using a quantum computer.

"Quantum mechanics is known for its randomness, so the easiest possibility is just to use quantum computing as a [random number generator]," Wootton said. "I have a game in which I use only one qubit: the smallest quantum computer you can get. All you can do is apply operations that change the probabilities of getting a zero or one as output. I use that to determine the height of the terrain at any point in the game map."
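The one-qubit scheme Wootton describes can be sketched classically. A qubit rotated by an angle theta reads out 1 with probability sin²(theta/2); sampling many shots and averaging gives a height per map point. The code below is a hedged illustration of that idea only: it simulates the qubit with a seeded pseudo-random generator, and the angle-from-coordinate formula, shot count, and height scale are all invented for the example, not taken from Wootton's game.

```python
import math
import random

def qubit_one_probability(theta):
    """Probability of measuring 1 after rotating a qubit from |0> by theta."""
    return math.sin(theta / 2) ** 2

def terrain_height(x, shots=1000, max_height=64, seed=0):
    """Estimate a terrain height at map coordinate x from repeated
    single-qubit measurements (simulated classically here)."""
    rng = random.Random(seed + x)  # seeded so the map is reproducible
    # Derive a rotation angle from the coordinate (an arbitrary choice).
    theta = (math.sin(x * 0.3) + 1) * math.pi / 2
    p = qubit_one_probability(theta)
    ones = sum(rng.random() < p for _ in range(shots))
    return round(max_height * ones / shots)

heights = [terrain_height(x) for x in range(8)]
print(heights)  # a row of terrain heights, each between 0 and 64
```

On real hardware, the `rng.random() < p` line would be replaced by actually running the rotated circuit for the given number of shots.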

Plenty of games made with classical computers have already included procedurally generated elements over the years. But as the requirements for these elements, ranging from randomly generated enemies to entire maps, increase in complexity, quantum could help.

"Gaming is an industry that is very dependent on how fast things run," he continued. "If there's a factor of 10 difference in how long it takes something to run, that determines whether you can actually use it in a game." He sees today as a great jumping-on point for people in the gaming industry to get involved and help shape the future development of quantum computing. "It's going to be driven by what people want," he explained. "If people find an interesting use case and everyone wants to use quantum computing for a game where you have to submit a job once per frame, that will help dictate the way that the technology is made."

He's now reached the point where he thinks the race may truly be on to develop the first commercial game using a quantum computer. "We've been working on these proof-of-principle projects, but now I want to work with actual game studios on actual problems that they have," he continued. "That means finding out what they want and how they want the technology to be [directed]."

One thing that's for certain is that Wootton is no longer alone in developing his quantum games. In the last couple of years, a number of quantum game jams have popped up around the world. "What most people have done is to start small," Wootton said. "They often take an existing game and use one or two qubits to help allow you to implement a quantum twist on the game mechanics." Following this mantra, enthusiasts have used quantum computing to make remixed versions of existing games, including Dr. Qubit (a quantum version of Dr. Mario), Quantum Cat-sweeper (a quantum version of Minesweeper), and Quantum Pong (a quantum version of, err, Pong).

The world of quantum gaming has moved beyond its 1950 equivalent of Nim. Now we just have to wait and see what happens next. The decades that followed Nim gave us MIT's legendary Spacewar in the 1960s, the arcade boom of the 1970s and '80s, the console wars of Sega vs. Nintendo, the arrival of the Sony PlayStation in the 1990s, and so on. In the process, classical computers became part of our lives in a way they never were before. As Whole Earth Catalog founder Stewart Brand predicted as far back as 1972, in his classic Rolling Stone essay on Spacewar: "Ready or not, computers are coming to the people."

At present, quantum gaming's future is at a crossroads. Is it an obscure niche occupied by just a few gaming physics enthusiasts, or a powerful tool that will shape tomorrow's industry? Is it something that will teach us all to appreciate the finer points of quantum physics, or a tool many of us won't even realize is being used, but that will nevertheless give us some dope-ass games to play?

Like Schrödinger's cat, right now it's both at once. What a superposition to be in.


Shaping the technology transforming our society – Fermi National Accelerator Laboratory

Technology and society are intertwined. Self-driving cars and facial recognition technologies are no longer science fiction, and data and efficiency are harbingers of this new world.

But these new technologies are only the beginning. In the coming decades, further advances in artificial intelligence and the dawn of quantum computing are poised to change lives in both discernible and inconspicuous ways.

"Even everyday technology, like a smartphone app, affects people in significant ways that they might not realize," said Fermilab scientist Daniel Bowring. "If there are concerns about something as familiar as an app, then we need to take more opaque and complicated technology, like AI, very seriously."

A two-day workshop took place from Oct. 31 to Nov. 1 at the University of Chicago to raise awareness and generate strategies for the ethical development and implementation of AI and quantum computing. The workshop was organized by the Chicago Quantum Exchange, a Chicago-based intellectual hub and community of researchers whose aim is to promote the exploration of quantum information technologies, and funded by the Kavli Foundation and the Center for Data and Computing, a University of Chicago center for research driven by data science and AI approaches.

Members of the Chicago Quantum Exchange engage in conversation at a workshop at the University of Chicago. Photo: Anne Ryan, University of Chicago

At the workshop, industry experts, physicists, sociologists, journalists and more gathered to learn, share insights and identify next steps as AI and quantum computing advance.

"AI and quantum computing are developing tools that will affect everyone," said Bowring, a member of the workshop organizing team. "It was important to us to get as many stakeholders in the room as possible."

Workshop participants listened to presentations that framed concerns such as power asymmetries, algorithmic bias and privacy before breaking out into small groups to deliberate these topics and develop actionable strategies. Groups reported to all attendees after each breakout session. On the last day of the workshop, participants considered how they would nurture the dialogue.

At one of the breakout sessions, participants discussed the balance between collaborative quantum computing research and national security. Today, the results of quantum computing research are dispersed in a wide variety of academic journals, and a lot of code is accessible and open source. However, because of its potential implications for cybersecurity and encryption, quantum computing is also of interest to national security, so it may be subject to intelligence and export controls. What endeavors, if any, should be open source or private? Are these outcomes realizable? What level of control should be maintained? How should these technologies be regulated?

"We're already behind on setting ground rules for these technologies, which, if left to progress on their own, could increase power asymmetries in society," said Brian Nord, Fermilab and University of Chicago scientist and member of the workshop organizing team. "Our research programs, for example, need to be crafted in a way that does not reinforce or exacerbate these asymmetries."

Workshop participants will continue the dialogue through online and in-person meetings to address key ethical and societal issues in the quantum and AI space. Potential future activities include writing proposals for joint research projects that consider ethical and societal implications, white papers addressed to academic audiences, and media editorials, as well as developing community action plans.

Organizers are planning to hold a panel next spring to engage the public, as well.

"The spring event will help us continue to spread awareness and engage a variety of groups on issues of ethics in AI and quantum computing," Nord said.

The workshop was sponsored by the Kavli Foundation in partnership with the Center for Data and Computing at the University of Chicago. Artificial intelligence and quantum information science are two of six initiatives identified as special priority by the Department of Energy Office of Science.

The Kavli Foundation is dedicated to advancing science for the benefit of humanity, promoting public understanding of scientific research, and supporting scientists and their work. The foundation's mission is implemented through an international program of research institutes, initiatives and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics, as well as the Kavli Prize and a program in public engagement with science. Visit kavlifoundation.org.

The Chicago Quantum Exchange catalyzes research activity across disciplines and member institutions. It is anchored by the University of Chicago, Argonne National Laboratory, Fermi National Accelerator Laboratory, and the University of Illinois at Urbana-Champaign and includes the University of Wisconsin-Madison, Northwestern University and industry partners. Visit chicagoquantum.org.


China is beating the US when it comes to quantum security – MIT Technology Review

It's been six years since hackers linked with China breached the US Office of Personnel Management's computer system and stole sensitive information about millions of federal employees and contractors. It was the sort of information that's collected during background checks for security clearances: very personal stuff. But not all was lost. Even though there were obviously some massive holes in the OPM's security setup, some of its data was encrypted. It was useless to the attackers.

Perhaps not for much longer. It's only a matter of time before even encrypted data is at risk. That's the view of John Prisco, CEO of Quantum Xchange, a cybersecurity firm based in Bethesda, Maryland. Speaking at the EmTech Future Compute event last week, he said that China's aggressive pursuit of quantum computing suggests it will eventually have a system capable of figuring out the key to access that data. Current encryption doesn't stand much of a chance against a quantum system tasked with breaking it.

"China is moving forward with a harvest today, read tomorrow approach," said Prisco. The country wants to steal as much data as possible, even if it can't access it yet, because it's banking on a future when it finally can, he said. Prisco says China is outspending the US in quantum computing 10 times over. It's allegedly spending $10 billion alone to build the National Laboratory for Quantum Information Sciences, scheduled to open next year (although this number is disputed). America's counterpunch is just $1.2 billion over five years toward quantum information science. "We're not really that safe," he said.


Part of China's massive investment has gone toward quantum security itself, including the development of quantum key distribution, or QKD. This involves sending encrypted data as classical bits (strictly binary information) over a fiber-optic network, while sending the keys used to decrypt the information in the form of qubits (which can represent more than just two states, thanks to quantum superposition). The mere act of trying to observe the key changes its state, alerting the sender and receiver to a security breach.
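The key-sifting step at the heart of QKD schemes like BB84 can be sketched with a classical toy simulation (an illustration of the general idea, not any deployed system; the photon count and seed are arbitrary). Sender and receiver each pick a random measurement basis per photon, then keep only the bit positions where their bases happened to match:

```python
import random

rng = random.Random(42)  # seeded so the run is reproducible
n = 32                   # number of photons sent

# The sender picks a random bit and a random basis for each photon;
# the receiver independently picks a basis to measure each photon in.
sender_bits  = [rng.randint(0, 1) for _ in range(n)]
sender_bases = [rng.choice("+x") for _ in range(n)]  # '+' rectilinear, 'x' diagonal
recv_bases   = [rng.choice("+x") for _ in range(n)]

# When bases match, the receiver reads the bit correctly. When they differ,
# the measurement outcome is random, so both parties discard that position
# after publicly comparing their basis choices (not the bits themselves).
sifted_key = [bit for bit, sb, rb in zip(sender_bits, sender_bases, recv_bases)
              if sb == rb]

print(len(sifted_key))  # roughly half the positions survive the sift
```

An eavesdropper measuring in the wrong basis disturbs the photon's state, which shows up as an elevated error rate when the parties compare a sample of their sifted bits; that is the tamper-evidence the article describes.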

But it has its limits. QKD requires sending information-carrying photons over incredibly long distances (tens to hundreds of miles). The best way to do this right now is by installing a fiber-optic network, a costly and time-consuming process.

It's not foolproof, either. The signals eventually scatter and break down over long stretches of fiber optics, so you need to build nodes that will continue to boost them forward. These networks are also point-to-point only (as opposed to a broadcast connection), so you can communicate with only one other party at a time.

Nevertheless, China looks to be all in on QKD networks. It's already built a 1,263-mile link between Beijing and Shanghai to deliver quantum keys. And a successful QKD demonstration by the Chinese Micius satellite was reported across the 4,700 miles between Beijing and Vienna.

Even Europe is making aggressive strides: the European Union's OPENQKD initiative calls for using a combination of fiber optics and satellites to create a QKD-safe communications network covering 13 nations. The US, Prisco argues, is incredibly far behind, for which he blames a lack of urgency. The closest thing it has is a 500-mile fiber-optic cable running down the East Coast. Quantum Xchange has inked a deal to use the cable to create a QKD network that secures data transfers for customers (most notably the financial companies based around New York City).

With Europe and China already taking QKD seriously, Prisco wants to see the US catch up, and fast. "It's a lot like the space race," he said. "We really can't afford to come in second place."

Update: This story has been amended to note that the funding figures for the National Laboratory for Quantum Information Sciences are disputed among some experts.


Technology to Highlight the Next 10 Years: Quantum Computing – Somag News

Technology to Highlight the Next 10 Years According to a Strategy Expert: Quantum Computing

It is said that quantum computers, and quantum computing generally, will have an impact on human history in the coming years. A Bank of America strategist said that quantum computing will define the 2020s.

Bank of America strategist Haim Israel said: "The revolutionary technology that will emerge in the 2020s will be quantum computing. The iPhone was released in 2007, and we felt its real impact in the 2010s. We will not see the first business applications for quantum computing until the end of the next decade."

Strategy expert Haim Israel stated that the effect of quantum computing on business will be more radical and revolutionary than the effect of smartphones. Let's take a closer look at quantum computing.

What is Quantum Computing?

Quantum computing is a fairly new technology based on quantum theory in physics. Quantum theory, in the simplest terms, describes the behavior of subatomic particles and states that these particles can exist in more than one place until they are observed. Quantum computers, unlike today's computers, go beyond storing just zeros and ones, and thereby gain enormous computing power.

In October, Google, a subsidiary of Alphabet Inc., claimed that, using a 53-qubit quantum computing chip, it completed in 200 seconds a calculation that would take the fastest supercomputer 10,000 years. Amazon said earlier this month that it intends to cooperate with experts to develop quantum computing technologies. IBM and Microsoft are also among the companies developing quantum computing technologies.

Quantum computing could reshape health care, the Internet of things, and cybersecurity:

Israel said quantum computing will have revolutionary implications in areas such as health care, the Internet of things, and cybersecurity. "Pharmaceutical companies will be the first commercial users of these devices," he said, adding that only quantum computers can solve the pharmaceutical industry's big-data problem.

Quantum computing will also have a major impact on cybersecurity. Today's cybersecurity systems are based on cryptographic algorithms, but with quantum computing these equations can be broken in a very short time. "Even the most powerful encryption algorithms will be significantly weakened by quantum computation in the future," said Okta marketing manager Swaroop Sham.

For investors, Israel said that the first one or two companies to develop commercially applicable quantum computing in this field could access huge amounts of data. This makes the software of these companies very valuable to customers.



The Hits And Misses Of AWS re:Invent 2019 – Forbes

AWS re:Invent 2019, which concluded last week, marked another milestone for Amazon and the cloud computing ecosystem. Some of the new AWS services announced this year will become the foundation for upcoming products and services.


Though there were many surprises, AWS didn't mention or announce some of the services that were expected by the community. My own predictions for AWS re:Invent 2019 were partially accurate.

Based on the wishlist and what was expected, here is a list of hits and misses from this year's mega cloud event:

Hits of AWS re:Invent 2019

1) Quantum Computing Delivered through Amazon Braket

After IBM, Microsoft, and Google, it was Amazon's turn to jump on the quantum computing bandwagon.

Amazon Braket is a managed service for quantum computing that provides a development environment to explore and design quantum algorithms, test them on simulated quantum computers, and run them on different quantum hardware technologies.

This new service from Amazon lets customers run both quantum and classical tasks on a hybrid infrastructure. It is tightly integrated with existing AWS services such as S3 and CloudWatch.

Amazon Braket has the potential to become one of the key pillars of AWS compute services.

2) Leveraging Project Nitro

Project Nitro is a collection of hardware accelerators that offload hypervisor, storage, and network functions to custom chips, freeing up resources on EC2 to deliver the best performance.

Amazon has started to launch additional EC2 instance types based on custom chips powered by the Nitro System. The Inf1 family of EC2 instances delivers a best-of-breed hardware and software combination to accelerate machine learning model inferencing.

Along with Nitro, Amazon is also investing in ARM-based compute resources. Amazon EC2 now offers general-purpose (M6g), compute-optimized (C6g), and memory-optimized (R6g) instances powered by the AWS Graviton2 processor, which uses 64-bit Arm Neoverse cores and custom silicon designed by AWS.

Going forward, Amazon will launch additional instance types based on Graviton2 processors that will become cheaper alternatives to Intel x64-based instance types.

3) Augmented AI with Human in the Loop

Remember Amazon Mechanical Turk (MTurk)? A crowdsourced service that delegates jobs to real humans. Based on the lessons learned from applying automation to retail, Amazon encourages keeping the human in the loop.

More recently, Amazon launched SageMaker Ground Truth, a data labeling service powered by humans. Customers can upload raw datasets and have humans draw bounding boxes around specific objects identified in the images. This increases accuracy while training machine learning models.

With Amazon Augmented AI (Amazon A2I), AWS now introduces human-driven validation of machine learning models. The low-confidence predictions from an augmented AI model are sent to real humans for validation. This increases the precision and accuracy of models while performing predictions from an ML model.

Amazon continues to bring humans into the technology-driven automation loop.

4) AI-driven Code Review and Profiling through Amazon CodeGuru

Amazon CodeGuru is a managed service that helps developers proactively improve code quality and application performance through AI-driven recommendations. The service comes with a reviewer and profiler that can detect and identify issues in code. Amazon CodeGuru can review and profile Java code targeting the Java Virtual Machine.

This service was expected to come from a platform and tools vendor. Given the heritage of developer tools, I was expecting this from Microsoft. But Amazon has taken a lead in infusing AI into code review and analysis.

CodeGuru is one of my favorite announcements from AWS re:Invent 2019.

5) Decentralized Cloud Infrastructure - Local Zones and AWS Wavelength

When the competition is caught up in expanding the footprint of data centers through traditional regions and zones, Amazon has taken an unconventional approach of setting up mini data centers in each metro.

The partnership with Verizon and other telecom providers is a great move from AWS.

Both Local Zones and AWS Wavelength are game changers from Amazon. They redefine edge computing by providing a continuum of compute services.

Bonus: AWS DeepComposer

Having launched DeepLens in 2017 and DeepRacer in 2018, I was curious to see how AWS mixes and matches its deep learning research with a hardware-based, educational device.

AWS DeepComposer brings the power of Generative Adversarial Networks (GAN) to music composition.

Misses of AWS re:Invent 2019

1) Open Source Strategy

Open source was conspicuously missing from the keynotes at re:Invent. With a veteran like Adrian Cockcroft leading the open source efforts, I was expecting Amazon to make a significant announcement related to OSS.

Amazon has many internal projects that are good candidates for open source. From machine learning to compute infrastructure, AWS has many ongoing research efforts. Open-sourcing a tiny subset of these projects could immensely benefit the community.

The only open source project that was talked about was Firecracker, which was announced last year. Even for that, Amazon didn't mention handing it over to a governing body to drive broader contribution and participation from the community.

The industry expects Amazon to actively participate in open source initiatives.

2) Container Strategy

Containers are the building blocks of modern infrastructure. They are becoming the de facto standard to build modern, cloud native applications.

With Amazon claiming that 80% of all containerized and Kubernetes applications running in the cloud run on AWS, I expected a streamlined developer experience for deploying containerized workloads on AWS.

The current developer experience of dealing with AWS container services such as ECS, Fargate and EKS leaves a lot to be desired.

The only significant container-related announcement from re:Invent 2019 was the general availability of EKS on Fargate, a serverless way to run Kubernetes pods. Based on my personal experience, I found the service to be complex.

Both Microsoft and Google score high on innovation in containerized platforms and on enhancing the developer experience.

AWS has work to do in simplifying the developer workflow when dealing with containerized workloads.

3) VMware Partnership

Surprisingly, there was no discussion of the roadmap, growth, and adoption of VMware Cloud on AWS. While the focus shifted to AWS Outposts, there was no mention of the upcoming AWS managed services on VMware.

Though AWS Outposts is available on vSphere, the GA announcement had little to no mention of Outposts on VMware.

4) Simplified Developer Experience

AWS now has multiple compute services in the form of EC2 (IaaS), Beanstalk (PaaS), Lambda (FaaS) and Container Services offered through ECS, Fargate and EKS (CaaS).

Amazon recommends using a variety of tools to manage the lifecycle of infrastructure and applications. Customers use CloudFormation, Kubernetes YAML, the Cloud Development Kit (CDK), and the Serverless Application Model (SAM) to deal with workloads running in different compute environments.

The current deployment model and programmability aspects of AWS are becoming increasingly complex. There is a need to simplify the developer and admin experience of AWS.

I was expecting a new programmability model from Amazon that would make it easier for developers to target AWS for running their workloads.

5) Custom AutoML Models for Offline Usage

Though AWS launched SageMaker Autopilot and Rekognition Custom Labels in the AutoML domain, it didn't mention enhancing AutoML-based language services for newer verticals and domains.

Custom models trained through Amazon's AutoML services cannot be exported for offline usage in disconnected scenarios such as industrial automation. None of the services are integrated with AWS Greengrass deployments for offline inferencing.

Both Google and Microsoft allow exporting AutoML models optimized for the edge.

The Amazon Comprehend service could easily be expanded through AutoML to support newer verticals and domains, such as legal and finance.

Though the above announcements and services didn't make it to this year's re:Invent, I am sure they are on the roadmap.
