Archive for the ‘Quantum Computing’ Category

Technology to Highlight the Next 10 Years: Quantum Computing – Somag News

Technology to Highlight the Next 10 Years According to a Strategy Expert: Quantum Computing

It is said that quantum computers and quantum computing will have an impact on human history in the coming years. A Bank of America strategist said that quantum computing will define the 2020s.

Bank of America strategist Haim Israel said that the revolutionary technology to emerge in the 2020s will be quantum computing. He drew a parallel with the iPhone: it was released in 2007, but its real impact was felt in the 2010s. Likewise, we will not see the first business applications of quantum computing until the end of the next decade.

Haim Israel stated that the effect of quantum computing on business will be more radical and revolutionary than the effect of smartphones. Let's take a closer look at quantum computing.

What Is Quantum Computing?

Quantum computing is a fairly new technology based on quantum theory in physics. Quantum theory, in its simplest form, describes the behavior of subatomic particles and states that these particles can exist in more than one state until they are observed. Quantum computers, unlike today's computers, go beyond storing only zeros and ones: their qubits can hold superpositions of both at once, which is the source of their enormous computing power.
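As a rough illustration (not tied to any vendor's hardware), a short classical simulation shows why qubit counts matter: describing the joint state of n qubits takes 2^n complex amplitudes, which is what quickly overwhelms ordinary computers.

```python
import numpy as np

# A single qubit in an equal superposition of |0> and |1>.
qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)

# The joint state of n qubits lives in a 2**n-dimensional space, which is why
# simulating even ~50 qubits classically becomes infeasible.
n = 3
state = qubit
for _ in range(n - 1):
    state = np.kron(state, qubit)

print(state.shape)         # (8,) -> 2**3 amplitudes
print(np.abs(state) ** 2)  # each basis state is measured with probability 1/8
```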

In October, Google, a subsidiary of Alphabet Inc., claimed that its 53-qubit quantum computing chip completed in 200 seconds a calculation that would take the fastest supercomputer 10,000 years. Amazon said earlier this month that it intends to cooperate with experts to develop quantum computing technologies. IBM and Microsoft are also among the companies developing quantum computing technologies.

Quantum computing could reshape health care, the Internet of Things, and cyber security:

Israel said quantum computing would have revolutionary implications in areas such as health care, the Internet of Things, and cyber security. Pharmaceutical companies will be the first commercial users of these devices, he said, adding that only quantum computers can solve the pharmaceutical industry's big data problem.

Quantum computing will also have a major impact on cyber security. Today's cyber security systems are based on cryptographic algorithms, but with quantum computing these equations could be broken in a very short time. "Even the most powerful encryption algorithms will be significantly weakened by quantum computing in the future," said Okta marketing manager Swaroop Sham.

For investors, Israel said that the first one or two companies to develop commercially viable quantum computing will gain access to huge amounts of data, which will make their software very valuable to customers.


Read more here:

Technology to Highlight the Next 10 Years: Quantum Computing - Somag News

The Hits And Misses Of AWS re:Invent 2019 – Forbes

AWS re:Invent 2019, which concluded last week, marked another milestone for Amazon and the cloud computing ecosystem. Some of the new AWS services announced this year will become the foundation for upcoming products and services.


Though there were many surprises, AWS didn't mention or announce some of the services the community expected. My own predictions for AWS re:Invent 2019 were partially accurate.

Based on the wishlist and what was expected, here is a list of hits and misses from this year's mega cloud event:

Hits of AWS re:Invent 2019

1) Quantum Computing Delivered through Amazon Braket

After IBM, Microsoft, and Google, it was Amazon's turn to jump on the quantum computing bandwagon.

Amazon Braket is a managed service for quantum computing that provides a development environment to explore and design quantum algorithms, test them on simulated quantum computers, and run them on different quantum hardware technologies.

This new service from Amazon lets customers run both quantum and classical tasks on a hybrid infrastructure. It is tightly integrated with existing AWS services such as S3 and CloudWatch.
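As a minimal sketch of that development environment, the snippet below builds and simulates a small circuit with the Amazon Braket Python SDK (the amazon-braket-sdk package); module and method names reflect the SDK as announced and may differ in later versions.

```python
from braket.circuits import Circuit
from braket.devices import LocalSimulator

# Build a two-qubit Bell-state circuit: Hadamard on qubit 0, then CNOT 0 -> 1.
bell = Circuit().h(0).cnot(0, 1)

# Run it on the local simulator bundled with the SDK. The same circuit can be
# submitted to managed simulators or quantum hardware devices instead.
device = LocalSimulator()
result = device.run(bell, shots=1000).result()

# Expect roughly 50/50 counts of '00' and '11'.
print(result.measurement_counts)
```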

Amazon Braket has the potential to become one of the key pillars of AWS compute services.

2) Leveraging Project Nitro

Project Nitro is a collection of hardware accelerators that offload hypervisor, storage, and networking functions to custom chips, freeing up resources on EC2 hosts to deliver the best performance.

Amazon has started to launch additional EC2 instance types based on custom chips powered by the Nitro System. The Inf1 family of EC2 instances delivers a best-of-breed combination of hardware and software to accelerate machine learning inference.

Along with Nitro, Amazon is also investing in ARM-based compute resources. Amazon EC2 now offers general purpose (M6g), compute optimized (C6g), and memory optimized (R6g) instances powered by the AWS Graviton2 processor, which uses 64-bit Arm Neoverse cores and custom silicon designed by AWS.

Going forward, Amazon will launch additional instance types based on Graviton2 processors that will become cheaper alternatives to Intel x64-based instance types.
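For illustration, launching one of these Graviton2 instances looks no different from launching any other instance type; the boto3 sketch below assumes a placeholder arm64 AMI ID and default networking.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch a Graviton2-based general purpose instance (m6g.large).
# The AMI ID is a placeholder; any arm64 AMI valid in your region works.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical arm64 AMI
    InstanceType="m6g.large",
    MinCount=1,
    MaxCount=1,
)

print(response["Instances"][0]["InstanceId"])
```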

3) Augmented AI with Human in the Loop

Remember Amazon Mechanical Turk (MTurk), the crowdsourcing service that delegates jobs to real humans? Based on its learnings from applying automation to retail, Amazon encourages keeping a human in the loop.

More recently, Amazon launched SageMaker Ground Truth, a data labeling service powered by humans. Customers can upload raw datasets and have humans draw bounding boxes around specific objects identified in the images. This improves accuracy when training machine learning models.

With Amazon Augmented AI (Amazon A2I), AWS now introduces human-driven validation of machine learning models. Low-confidence predictions from a model are sent to real humans for validation. This improves the precision and accuracy of the predictions an ML model delivers.
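A minimal sketch of triggering such a review with boto3 is shown below; the flow definition ARN, loop name, and payload are hypothetical placeholders, and the human review workflow itself must already be configured in A2I.

```python
import json
import boto3

a2i = boto3.client("sagemaker-a2i-runtime", region_name="us-east-1")

# Route a low-confidence prediction to the human reviewers defined by a flow definition.
response = a2i.start_human_loop(
    HumanLoopName="review-invoice-12345",  # hypothetical
    FlowDefinitionArn="arn:aws:sagemaker:us-east-1:111122223333:flow-definition/doc-review",  # hypothetical
    HumanLoopInput={
        "InputContent": json.dumps({"prediction": "invoice", "confidence": 0.42}),
    },
)

print(response["HumanLoopArn"])
```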

Amazon continues to bring humans into the technology-driven automation loop.

4) AI-driven Code Review and Profiling through Amazon CodeGuru

Amazon CodeGuru is a managed service that helps developers proactively improve code quality and application performance through AI-driven recommendations. The service comes with a reviewer and profiler that can detect and identify issues in code. Amazon CodeGuru can review and profile Java code targeting the Java Virtual Machine.
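As a rough sketch, associating a repository with the reviewer via boto3 might look like the following; the CodeCommit repository name is a hypothetical placeholder.

```python
import boto3

reviewer = boto3.client("codeguru-reviewer", region_name="us-east-1")

# Associate a CodeCommit repository with CodeGuru Reviewer so that new pull
# requests receive AI-driven review comments.
response = reviewer.associate_repository(
    Repository={"CodeCommit": {"Name": "my-java-service"}}  # hypothetical repository
)

print(response["RepositoryAssociation"]["State"])
```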

This service was expected to come from a platform and tools vendor. Given its heritage in developer tools, I was expecting this from Microsoft. But Amazon has taken the lead in infusing AI into code review and analysis.

CodeGuru is one of my favorite announcements from AWS re:Invent 2019.

5) Decentralized Cloud Infrastructure - Local Zones and AWS Wavelength

While the competition is caught up in expanding its data center footprint through traditional regions and zones, Amazon has taken the unconventional approach of setting up mini data centers in individual metros.

The partnership with Verizon and other telecom providers is a great move from AWS.

Both Local Zones and AWS Wavelength are game-changers from Amazon. They redefine edge computing by providing a continuum of compute services.
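To give a flavor of how Local Zones surface to developers, the boto3 sketch below opts an account into a zone group and lists zones; the group name matches the Los Angeles Local Zone announced at launch, but availability and naming may vary.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

# Opt the account in to the Los Angeles Local Zone group.
ec2.modify_availability_zone_group(
    GroupName="us-west-2-lax-1",
    OptInStatus="opted-in",
)

# Once opted in, the Local Zone shows up alongside regular Availability Zones,
# and subnets and EC2 instances can be created in it.
zones = ec2.describe_availability_zones(AllAvailabilityZones=True)
print([z["ZoneName"] for z in zones["AvailabilityZones"]])
```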

Bonus: AWS DeepComposer

AWS launched DeepLens in 2017 and DeepRacer in 2018, so I was curious to see how it would mix and match its deep learning research with a hardware-based educational device this year.

AWS DeepComposer brings the power of Generative Adversarial Networks (GAN) to music composition.

Misses of AWS re:Invent 2019

1) Open Source Strategy

Open source was conspicuously missing from the keynotes at re:Invent. With a veteran like Adrian Cockcroft leading the open source efforts, I was expecting Amazon to make a significant announcement related to OSS.

Amazon has many internal projects that are good candidates for open source. From machine learning to compute infrastructure, AWS has many ongoing research efforts. Open sourcing even a tiny subset of these projects could immensely benefit the community.

The only open source project that was talked about was Firecracker, which was announced last year. Even there, Amazon didn't mention handing it over to a governing body to drive broader contribution and participation from the community.

The industry expects Amazon to actively participate in open source initiatives.

2) Container Strategy

Containers are the building blocks of modern infrastructure. They are becoming the de facto standard to build modern, cloud native applications.

With Amazon claiming that 80% of all containerized and Kubernetes applications running in the cloud run on AWS, I expected a streamlined developer experience for deploying containerized workloads on AWS.

The current developer experience of dealing with AWS container services such as ECS, Fargate and EKS leaves a lot to be desired.

The only significant container-related announcement from re:Invent 2019 was the general availability of the serverless container platform based on EKS on Fargate. Based on my personal experience, I found the service to be complex.
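For context, running EKS pods on Fargate hinges on Fargate profiles; a boto3 sketch is below, with the cluster name, execution role ARN, and subnet IDs as hypothetical placeholders.

```python
import boto3

eks = boto3.client("eks", region_name="us-east-1")

# Create a Fargate profile so that pods in the selected namespace are scheduled
# onto Fargate instead of managed EC2 worker nodes.
eks.create_fargate_profile(
    fargateProfileName="default-namespace",
    clusterName="demo-cluster",                                             # hypothetical
    podExecutionRoleArn="arn:aws:iam::111122223333:role/eks-fargate-pods",  # hypothetical
    subnets=["subnet-0abc1234", "subnet-0def5678"],                         # hypothetical
    selectors=[{"namespace": "default"}],
)
```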

Both Microsoft and Google score high on innovation in container platforms and on enhancing the developer experience.

AWS has work to do in simplifying the developer workflow when dealing with containerized workloads.

3) VMware Partnership

Surprisingly, there was no discussion of the roadmap, growth, and adoption of VMware Cloud on AWS. While the focus shifted to AWS Outposts, there was no mention of the upcoming AWS managed services on VMware.

Though AWS Outposts are available on vSphere, the GA announcement had little to no mention of Outposts on VMware.

4) Simplified Developer Experience

AWS now has multiple compute services in the form of EC2 (IaaS), Beanstalk (PaaS), Lambda (FaaS) and Container Services offered through ECS, Fargate and EKS (CaaS).

Amazon recommends using a variety of tools to manage the lifecycle of infrastructure and applications. Customers use CloudFormation, Kubernetes YAML, the Cloud Development Kit (CDK), and the Serverless Application Model (SAM) to deal with workloads running in different compute environments.
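The CDK is the newest of these; as a small sketch (assuming the CDK v1 Python bindings and a local lambda/ directory containing the handler code), a stack that provisions a single Lambda function looks roughly like this:

```python
from aws_cdk import core
from aws_cdk import aws_lambda as _lambda


class HelloStack(core.Stack):
    """Provision one Lambda function whose code lives in ./lambda."""

    def __init__(self, scope: core.Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        _lambda.Function(
            self, "HelloHandler",
            runtime=_lambda.Runtime.PYTHON_3_7,
            handler="hello.handler",              # hypothetical module.function
            code=_lambda.Code.from_asset("lambda"),
        )


app = core.App()
HelloStack(app, "HelloStack")
app.synth()  # `cdk deploy` turns the synthesized template into running resources
```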

The current deployment model and programmability aspects of AWS are becoming increasingly complex. There is a need to simplify the developer and admin experience of AWS.

I was expecting a new programmability model from Amazon that would make it easier for developers to target AWS for running their workloads.

5) Custom AutoML Models for Offline Usage

Though AWS launched SageMaker Autopilot and Rekognition Custom Labels in the AutoML domain, it didn't mention enhancing its AutoML-based language services for newer verticals and domains.

Custom models trained through Amazon's AutoML services cannot be exported for offline usage in disconnected scenarios such as industrial automation. None of the services are integrated with AWS Greengrass deployments for offline inferencing.
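To illustrate the limitation, inference against a Rekognition Custom Labels model goes through the cloud API, as in the hypothetical boto3 sketch below (project version ARN, bucket, and key are placeholders), rather than through a model artifact you could deploy to a Greengrass device.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Every prediction is an online API call against the hosted model; there is no
# exported artifact to run at the edge.
response = rekognition.detect_custom_labels(
    ProjectVersionArn="arn:aws:rekognition:us-east-1:111122223333:project/defects/version/v1/1575000000000",  # hypothetical
    Image={"S3Object": {"Bucket": "my-factory-images", "Name": "frame-001.jpg"}},  # hypothetical
    MinConfidence=70,
)

for label in response["CustomLabels"]:
    print(label["Name"], label["Confidence"])
```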

Both Google and Microsoft support exporting AutoML models optimized for the edge.

The Amazon Comprehend service could easily be expanded to support newer verticals and domains, such as legal and finance, through AutoML.

Though the above announcements and services didn't make it to this year's re:Invent, I am sure they are on the roadmap.

Read more:

The Hits And Misses Of AWS re:Invent 2019 - Forbes