Archive for the ‘Quantum Computing’ Category

Shaping the technology transforming our society – Fermi National Accelerator Laboratory

Technology and society are intertwined. Self-driving cars and facial recognition technologies are no longer science fiction, and data and efficiency are harbingers of this new world.

But these new technologies are only the beginning. In the coming decades, further advances in artificial intelligence and the dawn of quantum computing are poised to change lives in both discernible and inconspicuous ways.

"Even everyday technology, like a smartphone app, affects people in significant ways that they might not realize," said Fermilab scientist Daniel Bowring. "If there are concerns about something as familiar as an app, then we need to take more opaque and complicated technology, like AI, very seriously."

A two-day workshop took place Oct. 31 to Nov. 1 at the University of Chicago to raise awareness and generate strategies for the ethical development and implementation of AI and quantum computing. The workshop was organized by the Chicago Quantum Exchange, a Chicago-based intellectual hub and community of researchers whose aim is to promote the exploration of quantum information technologies, and funded by the Kavli Foundation and the Center for Data and Computing, a University of Chicago center for research driven by data science and AI approaches.

Members of the Chicago Quantum Exchange engage in conversation at a workshop at the University of Chicago. Photo: Anne Ryan, University of Chicago

At the workshop, industry experts, physicists, sociologists, journalists and more gathered to learn, share insights and identify next steps as AI and quantum computing advance.

"AI and quantum computing are developing tools that will affect everyone," said Bowring, a member of the workshop organizing team. "It was important to us to get as many stakeholders in the room as possible."

Workshop participants listened to presentations that framed concerns such as power asymmetries, algorithmic bias and privacy before breaking out into small groups to deliberate these topics and develop actionable strategies. Groups reported to all attendees after each breakout session. On the last day of the workshop, participants considered how they would nurture the dialogue.

At one of the breakout sessions, participants discussed the balance between collaborative quantum computing research and national security. Today, the results of quantum computing research are dispersed in a wide variety of academic journals, and a lot of code is accessible and open source. However, because of its potential implications for cybersecurity and encryption, quantum computing is also of interest to national security, so it may be subject to intelligence and export controls. What endeavors, if any, should be open source or private? Are these outcomes realizable? What level of control should be maintained? How should these technologies be regulated?

"We're already behind on setting ground rules for these technologies, which, if left to progress on their own, could increase power asymmetries in society," said Brian Nord, Fermilab and University of Chicago scientist and member of the workshop organizing team. "Our research programs, for example, need to be crafted in a way that does not reinforce or exacerbate these asymmetries."

Workshop participants will continue the dialogue through online and in-person meetings to address key ethical and societal issues in the quantum and AI space. Potential future activities include writing proposals for joint research projects that consider ethical and societal implications, white papers addressed to academic audiences, and media editorials and developing community action plans.

Organizers are planning to hold a panel next spring to engage the public, as well.

"The spring event will help us continue to spread awareness and engage a variety of groups on issues of ethics in AI and quantum computing," Nord said.

The workshop was sponsored by the Kavli Foundation in partnership with the Center for Data and Computing at the University of Chicago. Artificial intelligence and quantum information science are two of six initiatives identified as special priority by the Department of Energy Office of Science.

The Kavli Foundation is dedicated to advancing science for the benefit of humanity, promoting public understanding of scientific research, and supporting scientists and their work. The foundation's mission is implemented through an international program of research institutes, initiatives and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics, as well as the Kavli Prize and a program in public engagement with science. Visit kavlifoundation.org.

The Chicago Quantum Exchange catalyzes research activity across disciplines and member institutions. It is anchored by the University of Chicago, Argonne National Laboratory, Fermi National Accelerator Laboratory, and the University of Illinois at Urbana-Champaign and includes the University of Wisconsin-Madison, Northwestern University and industry partners. Visit chicagoquantum.org.

Link:

Shaping the technology transforming our society - Fermi National Accelerator Laboratory

China is beating the US when it comes to quantum security – MIT Technology Review

It's been six years since hackers linked with China breached the US Office of Personnel Management's computer system and stole sensitive information about millions of federal employees and contractors. It was the sort of information that's collected during background checks for security clearances: very personal stuff. But not all was lost. Even though there were obviously some massive holes in the OPM's security setup, some of its data was encrypted. It was useless to the attackers.

Perhaps not for much longer. It's only a matter of time before even encrypted data is at risk. That's the view of John Prisco, CEO of Quantum Xchange, a cybersecurity firm based in Bethesda, Maryland. Speaking at the EmTech Future Compute event last week, he said that China's aggressive pursuit of quantum computing suggests it will eventually have a system capable of figuring out the key to access that data. Current encryption doesn't stand much of a chance against a quantum system tasked with breaking it.

China is moving forward with a "harvest today, read tomorrow" approach, said Prisco. The country wants to steal as much data as possible, even if it can't access it yet, because it's banking on a future when it finally can, he said. Prisco says China is outspending the US in quantum computing 10 times over. It's allegedly spending $10 billion alone to build the National Laboratory for Quantum Information Sciences, scheduled to open next year (although this number is disputed). America's counterpunch is just $1.2 billion over five years toward quantum information science. "We're not really that safe," he said.


Part of China's massive investment has gone toward quantum security itself, including the development of quantum key distribution, or QKD. This involves sending encrypted data as classical bits (strictly binary information) over a fiber-optic network, while sending the keys used to decrypt the information in the form of qubits (which can represent more than just two states, thanks to quantum superposition). The mere act of trying to observe the key changes its state, alerting the sender and receiver to a security breach.
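That detection mechanism can be sketched with a toy simulation, in the spirit of the BB84 protocol (a hand-rolled, illustrative model; the function names are my own, not any real QKD library): an eavesdropper who measures photons in a randomly guessed basis disturbs roughly a quarter of the sifted key, which the endpoints notice when they compare a sample of it.

```python
import random

def measure(bit, basis, measure_basis, rng):
    """Measure a photon. A matching basis reads the bit faithfully; a
    mismatched basis yields a random result and collapses the photon
    into the measuring basis."""
    if measure_basis == basis:
        return bit, basis
    return rng.randint(0, 1), measure_basis

def bb84_error_rate(n_photons, eavesdrop, seed=42):
    """Fraction of sifted key bits that disagree between sender and receiver."""
    rng = random.Random(seed)
    errors = sifted = 0
    for _ in range(n_photons):
        alice_bit = rng.randint(0, 1)
        alice_basis = rng.randint(0, 1)        # 0 = rectilinear, 1 = diagonal
        bit, basis = alice_bit, alice_basis
        if eavesdrop:                          # Eve measures, then re-sends
            bit, basis = measure(bit, basis, rng.randint(0, 1), rng)
        bob_basis = rng.randint(0, 1)
        bob_bit, _ = measure(bit, basis, bob_basis, rng)
        if bob_basis == alice_basis:           # sifting: keep matching-basis rounds
            sifted += 1
            errors += (bob_bit != alice_bit)
    return errors / sifted

print(bb84_error_rate(20000, eavesdrop=False))  # 0.0: undisturbed channel
print(bb84_error_rate(20000, eavesdrop=True))   # roughly 0.25: Eve is visible
```

With no eavesdropper the sifted bits always agree; with one, Eve guesses the wrong basis half the time, and each wrong guess randomizes the receiver's result half the time, giving the characteristic ~25% error rate that flags the breach.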

But it has its limits. QKD requires sending information-carrying photons over incredibly long distances (tens to hundreds of miles). The best way to do this right now is by installing a fiber-optic network, a costly and time-consuming process.

It's not foolproof, either. The signals eventually scatter and break down over long stretches of fiber optics, so you need to build nodes that will continue to boost them forward. These networks are also point-to-point only (as opposed to a broadcast connection), so you can communicate with only one other party at a time.

Nevertheless, China looks to be all in on QKD networks. It's already built a 1,263-mile link between Beijing and Shanghai to deliver quantum keys. And a successful QKD demonstration by the Chinese Micius satellite was reported across the 4,700 miles between Beijing and Vienna.

Even Europe is making aggressive strides: the European Union's OPENQKD initiative calls for using a combination of fiber optics and satellites to create a QKD-safe communications network covering 13 nations. The US, Prisco argues, is incredibly far behind, for which he blames a lack of urgency. The closest thing it has is a 500-mile fiber-optic cable running down the East Coast. Quantum Xchange has inked a deal to use the cable to create a QKD network that secures data transfers for customers (most notably the financial companies based around New York City).

With Europe and China already taking QKD seriously, Prisco wants to see the US catch up, and fast. "It's a lot like the space race," he said. "We really can't afford to come in second place."

Update: This story has been amended to note that the funding figures for the National Laboratory for Quantum Information Sciences are disputed among some experts.

Visit link:

China is beating the US when it comes to quantum security - MIT Technology Review

Technology to Highlight the Next 10 Years: Quantum Computing – Somag News

Technology to Highlight the Next 10 Years According to a Strategy Expert: Quantum Computing

Quantum computing is expected to leave its mark on human history in the coming years. A Bank of America strategist says it will define the 2020s.

Bank of America strategist Haim Israel said the revolutionary technology to emerge in the 2020s will be quantum computing. The iPhone was released in 2007, he noted, and we felt its real impact in the 2010s; likewise, we will not see the first business applications of quantum computing until the end of the next decade.

Israel stated that the effect of quantum computing on business will be even more radical and revolutionary than the effect of smartphones. Let's take a closer look at quantum computing.

What is Quantum Computing?

Quantum computing is a fairly new technology based on quantum theory in physics. Quantum theory, in the simplest terms, describes the behavior of subatomic particles and states that these particles can exist in more than one state until they are observed. Unlike today's computers, which store information only as zeros and ones, quantum computers go beyond binary storage and gain enormous computing power.
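As a loose illustration (a hand-rolled sketch, not any real quantum SDK), a single qubit can be modeled as two amplitudes whose squares give the probabilities of observing 0 or 1:

```python
import math

# A qubit modeled as a pair of real amplitudes (a, b) for the states |0> and |1>.
# The squared amplitudes are the probabilities of each measurement outcome.
zero = (1.0, 0.0)                     # a qubit definitely in state |0>

def hadamard(q):
    """Apply a Hadamard gate: turns a basis state into an equal superposition."""
    a, b = q
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(q):
    a, b = q
    return (a * a, b * b)

plus = hadamard(zero)                 # "both 0 and 1" until observed
print(probabilities(plus))            # roughly (0.5, 0.5)
```

The power comes from scale: a register of n such qubits requires tracking 2^n amplitudes at once, which is why simulating even around 50 qubits strains classical supercomputers.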

In October, Google, a subsidiary of Alphabet Inc., claimed that its 53-qubit quantum computing chip completed in 200 seconds a calculation that would take the fastest supercomputer 10,000 years. Amazon said earlier this month that it intends to work with experts to develop quantum computing technologies. IBM and Microsoft are also among the companies developing quantum computing technologies.

Quantum computing could remake health care, the Internet of things and cybersecurity:

Israel said quantum computing will have revolutionary implications in areas such as health care, the Internet of things and cybersecurity. Pharmaceutical companies will be the first commercial users of these devices, he said, adding that only quantum computers can solve the pharmaceutical industry's big-data problem.

Quantum computing will also have a major impact on cybersecurity. Today's cybersecurity systems are based on cryptographic algorithms, but with quantum computing these equations can be broken in a very short time. "Even the most powerful encryption algorithms will be significantly weakened by quantum computing," said Okta marketing manager Swaroop Sham.
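The threat comes from Shor's algorithm, which breaks RSA-style encryption by finding the period of a^x mod N. The number theory can be sketched classically (a toy with a brute-force period search; the function name is my own, and the period-finding step is exactly what a quantum computer speeds up exponentially):

```python
from math import gcd

def factor_via_period(N, a=2):
    """Classical sketch of the math behind Shor's algorithm: find the
    period r of a^x mod N, then gcd(a^(r/2) +/- 1, N) splits N."""
    # Brute-force the period -- this is the step quantum hardware replaces.
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    if r % 2:
        return None                    # odd period: retry with another a
    x = pow(a, r // 2, N)
    p = gcd(x - 1, N)
    if 1 < p < N:
        return p, N // p
    return None

print(factor_via_period(15))   # (3, 5)
```

On a 2,048-bit RSA modulus the classical period search above is hopeless, but a quantum computer finds the period efficiently, which is why today's harvested ciphertext may be readable tomorrow.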

For investors, Israel said the first one or two companies to develop commercially viable quantum computing could gain access to huge amounts of data. That makes those companies' software very valuable to customers.


Read more here:

Technology to Highlight the Next 10 Years: Quantum Computing - Somag News

The Hits And Misses Of AWS re:Invent 2019 – Forbes

AWS re:Invent 2019, which concluded last week, marked another milestone for Amazon and the cloud computing ecosystem. Some of the new AWS services announced this year will become the foundation for upcoming products and services.


Though there were many surprises, AWS didn't mention or announce some of the services that were expected by the community. My own predictions for AWS re:Invent 2019 were partially accurate.

Based on the wishlist and what was expected, here is a list of hits and misses from this years mega cloud event:

Hits of AWS re:Invent 2019

1) Quantum Computing Delivered through Amazon Braket

After IBM, Microsoft, and Google, it was Amazon's turn to jump on the quantum computing bandwagon.

Amazon Braket is a managed service for quantum computing that provides a development environment to explore and design quantum algorithms, test them on simulated quantum computers, and run them on different quantum hardware technologies.

This new service from Amazon lets customers run both quantum and classical tasks on a hybrid infrastructure. It is tightly integrated with existing AWS services such as S3 and CloudWatch.

Amazon Braket has the potential to become one of the key pillars of AWS compute services.

2) Leveraging Project Nitro

Project Nitro is a collection of hardware accelerators that offload hypervisor, storage, and network functions to custom chips, freeing up resources on EC2 to deliver the best performance.

Amazon has started to launch additional EC2 instance types based on custom chips powered by the Nitro System. The Inf1 family of EC2 instances delivers a best-of-breed hardware and software combination to accelerate machine learning model inference.

Along with Nitro, Amazon is also investing in ARM-based compute resources. Amazon EC2 now offers general purpose (M6g), compute optimized (C6g), and memory optimized (R6g) instances powered by the AWS Graviton2 processor, which uses 64-bit Arm Neoverse cores and custom silicon designed by AWS.

Going forward, Amazon will launch additional instance types based on Graviton2 processors that will become cheaper alternatives to Intel x64-based instance types.

3) Augmented AI with Human in the Loop

Remember Amazon Mechanical Turk (MTurk)? It's a crowdsourcing service that delegates jobs to real humans. Based on lessons learned from applying automation to retail, Amazon encourages keeping a human in the loop.

More recently, Amazon launched SageMaker Ground Truth, a data labeling service powered by humans. Customers can upload raw datasets and have humans draw bounding boxes around specific objects identified in the images. This improves accuracy when training machine learning models.

With Amazon Augmented AI (Amazon A2I), AWS now introduces human-driven validation of machine learning models. Low-confidence predictions from a model are sent to real humans for validation, which increases the precision and accuracy of an ML model's predictions.
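The underlying pattern is simple to sketch (a hypothetical, hand-rolled illustration; this is not the actual A2I API, and the names are my own): predictions below a confidence threshold go to a human-review queue, while everything else passes through automatically.

```python
def route_predictions(predictions, threshold=0.9):
    """Split (item, label, confidence) model outputs into an auto-accepted
    list and a queue for human review, based on a confidence threshold."""
    auto, review = [], []
    for item, label, confidence in predictions:
        target = auto if confidence >= threshold else review
        target.append((item, label))
    return auto, review

batch = [
    ("doc-1", "invoice", 0.98),
    ("doc-2", "receipt", 0.42),   # low confidence: a person double-checks it
    ("doc-3", "invoice", 0.91),
]
auto, review = route_predictions(batch)
print(len(auto), len(review))     # 2 1
```

The human verdicts on the review queue can then be fed back as labeled training data, which is how the loop improves the model over time.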

Amazon continues to bring humans into the technology-driven automation loop.

4) AI-driven Code Review and Profiling through Amazon CodeGuru

Amazon CodeGuru is a managed service that helps developers proactively improve code quality and application performance through AI-driven recommendations. The service comes with a reviewer and profiler that can detect and identify issues in code. Amazon CodeGuru can review and profile Java code targeting the Java Virtual Machine.

This service was expected to come from a platform and tools vendor. Given its heritage in developer tools, I was expecting this from Microsoft. But Amazon has taken the lead in infusing AI into code review and analysis.

CodeGuru is one of my favorite announcements from AWS re:Invent 2019.

5) Decentralized Cloud Infrastructure - Local Zones and AWS Wavelength

While the competition is caught up in expanding the footprint of data centers through traditional regions and zones, Amazon has taken an unconventional approach: setting up mini data centers in each metro area.

The partnership with Verizon and other telecom providers is a great move from AWS.

Both Local Zones and AWS Wavelength are game changers from Amazon. They redefine edge computing by providing a continuum of compute services.

Bonus: AWS DeepComposer

After AWS launched DeepLens in 2017 and DeepRacer in 2018, I was curious to see how it would mix and match its deep learning research with a hardware-based educational device.

AWS DeepComposer brings the power of Generative Adversarial Networks (GAN) to music composition.

Misses of AWS re:Invent 2019

1) Open Source Strategy

Open source was conspicuously missing from the keynotes at re:Invent. With a veteran like Adrian Cockcroft leading its open source efforts, I was expecting Amazon to make a significant announcement related to OSS.

Amazon has many internal projects that are good candidates for open source. From machine learning to compute infrastructure, AWS has many ongoing research efforts. Open sourcing even a tiny subset of these projects could immensely benefit the community.

The only open source project that was talked about was Firecracker, which was announced last year. Even for that, Amazon didn't mention handing it over to a governing body to drive broader contribution and participation from the community.

The industry expects Amazon to actively participate in open source initiatives.

2) Container Strategy

Containers are the building blocks of modern infrastructure. They are becoming the de facto standard to build modern, cloud native applications.

With Amazon claiming that 80% of all containerized and Kubernetes applications running in the cloud run on AWS, I expected a streamlined developer experience for deploying containerized workloads on AWS.

The current developer experience of dealing with AWS container services such as ECS, Fargate and EKS leaves a lot to be desired.

The only significant container-related announcement from re:Invent 2019 was the general availability of the serverless container platform running EKS on Fargate. Based on my personal experience, I found the service to be complex.

Both Microsoft and Google score high on the innovation of containerized platforms and enhancing the developer experience.

AWS has work to do in simplifying the developer workflow when dealing with containerized workloads.

3) VMware Partnership

Surprisingly, there was no discussion on the roadmap, growth and adoption of VMware Cloud on AWS. While the focus shifted to AWS Outposts, there has been no mention of the upcoming AWS managed services on VMware.

Though AWS Outposts is available on vSphere, the GA announcement made little to no mention of Outposts on VMware.

4) Simplified Developer Experience

AWS now has multiple compute services in the form of EC2 (IaaS), Beanstalk (PaaS), Lambda (FaaS) and Container Services offered through ECS, Fargate and EKS (CaaS).

Amazon recommends using a variety of tools to manage the lifecycle of the infrastructure and applications. Customers use CloudFormation, Kubernetes YAML, Cloud Developer Kit (CDK) and Serverless Application Model (SAM) to deal with each of the workloads running in different compute environments.

The current deployment model and programmability aspects of AWS are becoming increasingly complex. There is a need to simplify the developer and admin experience of AWS.

I was expecting a new programmability model from Amazon that would make it easier for developers to target AWS for running their workloads.

5) Custom AutoML Models for Offline Usage

Though AWS launched SageMaker Autopilot and Rekognition Custom Labels in the AutoML domain, it didn't mention enhancing AutoML-based language services for newer verticals and domains.

Custom models trained through Amazons AutoML services cannot be exported for offline usage in disconnected scenarios such as industrial automation. None of the services are integrated with AWS Greengrass deployments for offline inferencing.

Both Google and Microsoft offer the ability to export AutoML models optimized for the edge.

Amazon Comprehend service could be easily expanded to support newer verticals and domains such as legal and finance through AutoML.

Though the above announcements and services didn't make it to this year's re:Invent, I am sure they are on the roadmap.

Read more:

The Hits And Misses Of AWS re:Invent 2019 - Forbes