Archive for the ‘Quantum Computing’ Category

Google’s top quantum computing brain may or may not have quit – Fudzilla

We will know when someone opens his office door

John Martinis, who had established Google's quantum hardware group in 2014, has cleaned out his office, put the cats out and left the building.

Martinis says that a few months after he got Google's now-legendary quantum computing experiment to work, he was reassigned from a leadership position to an advisory one.

Martinis told Wired that the change led to disagreements with Hartmut Neven, the longtime leader of Google's quantum project.

Martinis said he had to go because his professional goal is for someone to build a quantum computer.

Google has not disputed this account, and says that the company is grateful for Martinis' contributions and that Neven continues to head the company's quantum project.

Martinis retains his position as a professor at UC Santa Barbara, which he held throughout his tenure at Google, and says he will continue to work on quantum computing.

To be fair, Google's quantum computing project was founded by Neven, who pioneered Google's image search technology, and got enough cats together.

The project took on greater scale and ambition when Martinis joined in 2014 to establish Googles quantum hardware lab in Santa Barbara, bringing along several members of his university research group. His nearby lab at UC Santa Barbara had produced some of the most prominent work in the field over the past 20 years, helping to demonstrate the potential of using superconducting circuits to build qubits, the building blocks of quantum computers.

Google's ground-breaking supremacy experiment used 53 qubits working together. They took minutes to crunch through a carefully chosen math problem that the company calculated would take a supercomputer 10,000 years to work out. It still does not have a practical use, and the cats were said to be bored with the whole thing.

Read more here:
Google's top quantum computing brain may or may not have quit - Fudzilla

RMACC’s 10th High Performance Computing Symposium to Be Held Free Online – HPCwire

BOULDER, Colo., April 22, 2020 – The Rocky Mountain Advanced Computing Consortium (RMACC) will hold its 10th annual High Performance Computing Symposium as a multi-track online event on May 20-21. Registration for the event will be free to all who would like to attend.

The online Symposium will include presentations by two keynote speakers and a full slate of tutorial sessions. Another longtime Symposium tradition, a poster competition for students to showcase their own research, will also be continued. Competition winners will receive an all-expenses-paid trip to SC20 in Atlanta.

Major sponsor support is being provided by Intel, Dell and HPE with additional support from ARM, IBM, Lenovo and Silicon Mechanics.

Links to the Symposium registration, its schedule, and how to enter the poster competition can be found at www.rmacc.org/hpcsymposium.

The keynote speakers are Dr. Nick Bronn, a Research Staff Member in IBM's Experimental Quantum Computing group, and Dr. Jason Dexter, a working group coordinator for the groundbreaking black hole imaging studies published by the Event Horizon Telescope.

Dr. Bronn serves at IBM's TJ Watson Research Center in Yorktown Heights, NY. He has been responsible for qubit (quantum bit) device design, packaging, and cryogenic measurement, working toward scaling up larger numbers of qubits on a device and integration with novel implementations of microwave and cryogenic hardware. He will speak on the topic "Benchmarking and Enabling Noisy Near-term Quantum Hardware."

Dr. Dexter is a member of the astrophysical and planetary sciences faculty at the University of Colorado Boulder. He will speak on the role of high performance computing in understanding what we see in the first image of a black hole. Dr. Dexter is a member of both the Event Horizon Telescope and VLTI/GRAVITY collaborations, which can now image black holes.

Their appearances, along with the many tutorial sessions, continue the RMACC's annual tradition of showcasing cutting-edge HPC achievements in both education and industry.

The largest consortium of its kind, the RMACC is a collaboration among 30 academic and government research institutions in Arizona, Colorado, Idaho, Montana, Nevada, New Mexico, Utah, Washington and Wyoming. The consortium's mission is to facilitate widespread effective use of high performance computing throughout the 9-state intermountain region.

More about the RMACC and its mission can be found at the website: www.rmacc.org.

About RMACC

Primarily a volunteer organization, the RMACC is a collaboration among 30 academic and research institutions located in Arizona, Colorado, Idaho, Montana, Nevada, New Mexico, Utah, Washington and Wyoming. The RMACC's mission is to facilitate widespread effective use of high performance computing throughout this 9-state intermountain region.

Source: RMACC

See more here:
RMACC's 10th High Performance Computing Symposium to Be Held Free Online - HPCwire

Advanced Encryption Standard (AES): What It Is and How It Works – Hashed Out by The SSL Store

Understanding the advanced encryption standard on a basic level doesn't require a higher degree in computer science or Matrix-level consciousness. Let's break AES encryption down into layman's terms.

Hey, all. We know information security has been a hot topic since, well, forever. We entrust our personal and sensitive information to lots of major entities and still have problems with data breaches, data leaks, etc. Some of this happens because of weak networking security protocols or bad authentication management practices, but really, there are many ways that data breaches can occur. Actually decrypting a ciphertext without a key, however, is far more difficult. For that, we can thank encryption algorithms like the popular advanced encryption standard and the secure keys that scramble our data into indecipherable gibberish.

Let's look into how AES works and different applications for it. We'll be getting a little into some Matrix-based math, so grab your red pills and see how far this rabbit hole goes.

Let's hash it out.

You may have heard of the advanced encryption standard, or AES for short, but may not know the answer to the question "what is AES?" Here are four things you need to know about AES:

The National Institute of Standards and Technology (NIST) established AES as an encryption standard nearly 20 years ago to replace the aging data encryption standard (DES). After all, AES encryption keys can go up to 256 bits, whereas DES stopped at just 56 bits. NIST could have chosen a cipher that offered greater security, but the tradeoff would have required greater overhead that wouldn't be practical. So, they went with one that had great all-around performance and security.

AES's results are so successful that many entities and agencies have approved it and utilize it for encrypting sensitive information. The National Security Agency (NSA), as well as other governmental bodies, utilize AES encryption and keys to protect classified or other sensitive information. Furthermore, AES is often included in commercial products, including but not limited to:

Although it wouldn't literally take forever, it would take far longer than any of our lifetimes to crack an AES 256-bit encryption key using modern computing technology. This is from a brute force standpoint, as in trying every combination until we hear the click/unlocking sound. Certain protections are put in place to prevent stuff like this from happening quickly, such as a limit on password attempts before a lockdown (which may or may not include a time lapse) occurs before trying again. When we are dealing with computation in milliseconds, waiting 20 minutes to try another five times would seriously add to the time taken to crack a key.

Just how long would it take? We are venturing into a thousand monkeys working on a thousand typewriters to write A Tale of Two Cities territory. The number of possible combinations for AES 256-bit encryption is 2^256. Even if a computer can do multiple quadrillions of instructions per second, we are still in that eagles-wings-eroding-Mount-Everest time frame.
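
To put a rough number on it, here is a back-of-the-envelope sketch in Python. The guess rate is an assumption chosen purely for illustration, not a benchmark of any real machine:

```python
# Back-of-the-envelope: time to try every 256-bit key by brute force.
# GUESSES_PER_SECOND is an assumed, wildly generous rate for illustration.
KEYSPACE = 2 ** 256                     # number of possible 256-bit keys
GUESSES_PER_SECOND = 10 ** 15           # assumed: one quadrillion keys/second
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

years = KEYSPACE / (GUESSES_PER_SECOND * SECONDS_PER_YEAR)
print(f"{years:.2e} years")             # ~3.67e+54 years to exhaust the keyspace
```

Even at that absurd rate, and even if the key turned up halfway through the search on average, the answer doesn't meaningfully change.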

Needless to say, it's waaaaaaaaaaaaaaaaaaay (there's not enough memory on our computers to support the number of a's that I want to convey) longer than our current universe has been in existence. And that's just for a 16-byte block of data. So, as you can see, brute forcing AES, even if it is 128-bit AES, is futile.

That would likely change, though, once quantum computing becomes a little more mainstream, available, and effective. Quantum computing is expected to break AES encryption and require other methods to protect our data, but that's still a ways down the road.


To better understand what AES is, you need to understand how it works. But in order to see how the advanced encryption standard actually works, we first need to look at how it is set up and the rules concerning the process based on the user's selection of encryption strength. Typically, when we discuss using higher bit levels of security, we're looking at things that are more secure and more difficult to break or hack. While the data blocks are broken up into 128 bits, the key sizes have a few varying lengths: 128 bits, 192 bits, and 256 bits. What does this mean? Let's back it up for a second here.

We know that encryption typically deals in the scrambling of information into something unreadable and an associated key to decrypt the scramble. AES's scramble procedure uses four scrambling operations in rounds, meaning that it will perform the operations and then repeat the process based off of the previous round's results X number of times. Simplistically, if we put in X and get out Y, that would be one round. We would then put Y through the paces and get out Z for round 2. Rinse and repeat until we have completed the specified number of rounds.

The AES key size, specified above, will determine the number of rounds that the procedure will execute. For example: a 128-bit key runs 10 rounds, a 192-bit key runs 12 rounds, and a 256-bit key runs 14 rounds.

As mentioned, each round has four operations.

So, you've arrived this far. Now, you may be asking: why, oh why, didn't I take the blue pill?

Before we get to the operational parts of the advanced encryption standard, let's look at how the data is structured. What we mean is that the data the operations are performed upon is not left-to-right sequential as we normally think of it. It's stacked in a 4×4 matrix of 128 bits (16 bytes) per block in an array that's known as a state. A state looks something like this:

So, if your message was "blue pill or red", it would look something like this:

So, just to be clear, this is just one 16-byte block; every group of 16 bytes in a file is arranged in such a fashion. At this point, the systematic scramble begins through the application of each AES encryption operation.
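
As a sketch of that column-major layout, assuming the standard AES convention that consecutive bytes fill the columns of the state, the arrangement can be expressed like this (using the article's own 16-byte example):

```python
def bytes_to_state(block: bytes) -> list[list[int]]:
    """Arrange a 16-byte block into the 4x4 AES state, column by column."""
    assert len(block) == 16
    # state[row][col] = block[col * 4 + row]: bytes fill columns, not rows
    return [[block[col * 4 + row] for col in range(4)] for row in range(4)]

state = bytes_to_state(b"blue pill or red")
for row in state:
    print([chr(b) for b in row])
# ['b', ' ', 'l', ' ']   <- first byte of each 4-byte column
# ['l', 'p', ' ', 'r']
# ['u', 'i', 'o', 'e']
# ['e', 'l', 'r', 'd']
```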

As mentioned earlier, once we have our data arrangement, there are certain linked operations that will perform the scramble on each state. The purpose here is to convert the plaintext data into ciphertext through the use of a secret key.

The four types of AES operations are as follows (note: we'll get into the order of the operations in the next section): AddRoundKey, SubBytes, ShiftRows, and MixColumns.

As mentioned earlier, the key size determines the number of rounds of scrambling that will be performed. AES encryption uses the Rijndael key schedule, which derives the subkeys from the main key to perform the key expansion.

The AddRoundKey operation takes the current state of the data and executes the XOR Boolean operation against the current round's subkey. XOR means "exclusive or," which will yield a result of true if the inputs differ (e.g., one input must be 1 and the other input must be 0 to be true). There will be a unique subkey per round, plus one more (which will run at the end).
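
A minimal sketch of AddRoundKey, assuming the 4x4 state layout above and a round subkey laid out the same way:

```python
def add_round_key(state, round_key):
    """XOR every byte of the state with the matching byte of the round subkey."""
    return [[state[r][c] ^ round_key[r][c] for c in range(4)]
            for r in range(4)]
```

A handy property of XOR is that it is its own inverse (a ^ k ^ k == a), which is one reason decryption can simply reapply the same subkeys.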

The SubBytes operation, which stands for substitute bytes, will take the 16-byte block and run it through an S-Box (substitution box) to produce an alternate value. Simply put, the operation will take a value and then replace it by spitting out another value.

The actual S-Box operation is a complicated process, but just know that it's nearly impossible to decipher with conventional computing. Coupled with the rest of the AES operations, it will do its job to effectively scramble and obfuscate the source data.
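
Rather than paste the 256-entry lookup table, here is a sketch that derives the standard Rijndael S-Box from its published definition (the multiplicative inverse in GF(2^8) followed by an affine transform) and then applies it to the state:

```python
def gf_mul(a: int, b: int) -> int:
    """Multiply two bytes in GF(2^8), reducing by the AES polynomial."""
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        carry = a & 0x80
        a = (a << 1) & 0xFF
        if carry:
            a ^= 0x1B                      # x^8 + x^4 + x^3 + x + 1
        b >>= 1
    return p

def gf_inv(x: int) -> int:
    """Multiplicative inverse in GF(2^8), computed as x^254; 0 maps to 0."""
    r = 1
    for _ in range(254):
        r = gf_mul(r, x)
    return r if x else 0

def affine(b: int) -> int:
    """The AES affine transform over GF(2), with constant 0x63."""
    s = 0x63
    for i in range(8):
        bit = ((b >> i) ^ (b >> ((i + 4) % 8)) ^ (b >> ((i + 5) % 8))
               ^ (b >> ((i + 6) % 8)) ^ (b >> ((i + 7) % 8))) & 1
        s ^= bit << i
    return s

SBOX = [affine(gf_inv(x)) for x in range(256)]
assert SBOX[0x00] == 0x63 and SBOX[0x01] == 0x7C   # spot-check known entries

def sub_bytes(state):
    """Replace every byte of the state with its S-Box substitute."""
    return [[SBOX[b] for b in row] for row in state]
```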

The ShiftRows operation is a little more straightforward and easier to understand. Based on the arrangement of the data, the idea of ShiftRows is to move the positions of the data within their respective rows, with wrapping. Remember, the data is arranged in a stacked arrangement and not left to right like most of us are used to reading.

The first row goes unchanged. The second row shifts its bytes to the left by one position, with row wraparound. The third row shifts its bytes one position beyond that, moving them to the left by a total of two positions, with row wraparound. Likewise, the fourth row shifts its bytes to the left by a total of three positions, with row wraparound.
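
In code, the whole operation is one rotation per row; a minimal sketch, again assuming the 4x4 state above:

```python
def shift_rows(state):
    """Rotate row r of the state left by r positions (row 0 is unchanged)."""
    return [row[r:] + row[:r] for r, row in enumerate(state)]

# e.g., the second row [1, 2, 3, 4] becomes [2, 3, 4, 1] with wraparound
```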

The MixColumns operation, in a nutshell, is a linear transformation of the columns of the dataset. It uses matrix multiplication and bitwise XOR addition to output the results. The column data, which can be represented as a 4×1 matrix, is multiplied against a fixed 4×4 matrix, with the arithmetic carried out in a Galois field; the transformation is invertible, so decryption can undo it. That will look something like the following:

As you can see, there are four bytes that are run against a 4×4 matrix. In this case, matrix multiplication has each input byte affecting each output byte and, obviously, yields the same size.
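
A sketch of that column transform follows. Multiplication by 2 in the Galois field is the classic xtime operation, multiplication by 3 is xtime(a) ^ a, and the XORs play the role of addition in GF(2^8):

```python
def xtime(b: int) -> int:
    """Multiply a byte by 2 in GF(2^8), reducing by the AES polynomial."""
    b <<= 1
    return (b ^ 0x1B) & 0xFF if b & 0x100 else b

def mix_single_column(col):
    """Multiply one 4-byte column by the fixed AES matrix
    [[2,3,1,1],[1,2,3,1],[1,1,2,3],[3,1,1,2]] over GF(2^8)."""
    a0, a1, a2, a3 = col
    return [
        xtime(a0) ^ xtime(a1) ^ a1 ^ a2 ^ a3,   # 2*a0 ^ 3*a1 ^ a2 ^ a3
        a0 ^ xtime(a1) ^ xtime(a2) ^ a2 ^ a3,   # a0 ^ 2*a1 ^ 3*a2 ^ a3
        a0 ^ a1 ^ xtime(a2) ^ xtime(a3) ^ a3,   # a0 ^ a1 ^ 2*a2 ^ 3*a3
        xtime(a0) ^ a0 ^ a1 ^ a2 ^ xtime(a3),   # 3*a0 ^ a1 ^ a2 ^ 2*a3
    ]
```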

Now that we have a decent understanding of the different operations utilized to scramble our data via AES encryption, we can look at the order in which these operations execute. It will be as such: an initial AddRoundKey, followed in each round by SubBytes, ShiftRows, MixColumns and AddRoundKey, with the final round skipping MixColumns.

Note: The MixColumns operation is not in the final round. Without getting into the actual math of this, there's no additional benefit to performing this operation. In fact, doing so would simply make the decryption process a bit more taxing in terms of overhead.
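
Tying the sketches together, the round structure for AES-128 (10 rounds) looks roughly like this. It reuses the sub_bytes, shift_rows, mix_single_column and add_round_key sketches above and assumes the Rijndael key schedule has already produced the 11 round subkeys, which is elided here:

```python
def mix_columns(state):
    """Apply mix_single_column to each of the four state columns."""
    cols = [mix_single_column([state[r][c] for r in range(4)]) for c in range(4)]
    return [[cols[c][r] for c in range(4)] for r in range(4)]

def aes128_encrypt_block(state, round_keys):
    """Sketch of the AES-128 round loop; round_keys holds 11 subkeys."""
    state = add_round_key(state, round_keys[0])       # initial key whitening
    for rnd in range(1, 10):
        state = sub_bytes(state)
        state = shift_rows(state)
        state = mix_columns(state)
        state = add_round_key(state, round_keys[rnd])
    state = sub_bytes(state)                          # final round...
    state = shift_rows(state)
    return add_round_key(state, round_keys[10])       # ...with no MixColumns
```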

If we consider the number of rounds and the operations per round that are involved, by the end of it you should have a nicely scrambled block. And that is only a 16-byte block. Consider how much information that equates to in the big picture. It's minuscule when compared to today's file/packet sizes! So, if each 16-byte block has seemingly no discernible pattern (at least, any pattern that can be deciphered in a timely manner), I'd say AES has done its job.

We know the advanced encryption standard algorithm itself is quite effective, but its level of effectiveness depends on how it's implemented. Unlike the brute force attacks mentioned above, effective attacks are typically launched on the implementation and not on the algorithm itself. This can be equated to attacking users (as in phishing attacks) versus attacking the technology behind the service/function that may be hard to breach. These can be considered side-channel attacks, where the attacks are carried out on other aspects of the entire process and not the focal point of the security implementation.

While I always advocate going with a reasonable/effective security option, a lot of AES encryption is happening without you even knowing it. It's locking down spots of the computing world that would otherwise be wide open. In other words, there would be many more opportunities for hackers to capture data if the advanced encryption standard weren't implemented at all. We just need to know how to identify the open holes and figure out how to plug them. Some may be able to use AES and others may need another protocol or process.

Appreciate the encryption implementations we have, use the best ones when needed, and happy scrutinizing!

Link:
Advanced Encryption Standard (AES): What It Is and How It Works - Hashed Out by The SSL Store

Schrödinger's qubit: will a quantum computer replace traditional PCs? – The KXAN 36 News

The Russian electronics and robotics developer Khamster Robotics has introduced the first domestic mini PC with a Baikal processor and the Alt Linux operating system. On the one hand, this event aroused considerable interest from technical experts, and on the other from government agencies facing the daunting task of switching to Russian software. Yet back in 2010, Steve Jobs made the revolutionary statement that the personal computer was dead. Many authoritative publications and IT experts supported him, announcing the death of the PC in unison. Indeed, according to Gartner, this market segment has been declining rapidly for the seventh consecutive year. Compared to 2011, sales of desktop computers fell by 30%, reaching only 259 million units in 2018, a figure comparable to the distant 2006. Users increasingly demand a combination of functionality, design and performance. So, according to experts, against the background of a general decline in device sales, only 2-in-1 ultra-thin laptops will show positive growth in the near future.

However, you cannot deny that over the last decade the personal computer has evolved greatly. Laptops, tablets and smartphones have superseded it as the base device for work, casual gaming and web surfing, but a desktop computer is still indispensable for specialized tasks. Thanks to its more powerful processor and options for customization, a PC is better suited to working with graphics, running heavy games, carrying out complex calculations and other workstation functions.

A breakthrough in quantum technologies has proved once again that the computer is not going to give up and surrender completely to its mobile counterparts. Google's quantum processor, presented in September of last year, took just three and a half minutes to cope with a task that would have taken the most advanced supercomputer about ten thousand years. Experts explained what this means for the average consumer and in what direction the technology will develop.

A quantum computer, no matter what the skeptics say, already exists. It may not be the most productive computing device, but quantum superiority has been demonstrated. Google created a superconducting quantum simulator that, for the selected algorithm, showed much better performance than the most powerful classical computer. Experimental research on the creation of a quantum computer is being conducted in many research centers around the world. Huge financial resources are invested in this area, and there are no fundamental prohibitions on creating such machines; it is only a matter of reaching a certain technological level, said Vladimir Kabanov, General Director of the Avtomatika concern of the state corporation Rostec.

The term quantum advantage refers to the ability of quantum computing to solve problems inaccessible to classical computers, regardless of the use and practical applicability of these results. To perform its calculations, a quantum computer uses complicated phenomena of quantum mechanics: quantum entanglement and superposition, explained Vasily Shishkin, head of the cryptography laboratory at JSC SPC Kryptonite (part of X Holding).

The potential superiority of a quantum computer over a classical one is that the quantum computer operates not on ordinary bits but on quantum bits, or qubits. Unlike bits, which at each moment can only be in one of two states, 0 or 1, qubits take both of these values with a certain probability. This phenomenon is called quantum superposition, said Shishkin. Due to these characteristics, qubits can carry much more information, which drastically increases the computational power of a quantum computer.
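
As a minimal illustration of "taking both values with a certain probability": a qubit can be written as a two-component vector of amplitudes whose squared magnitudes give the measurement probabilities. A sketch using numpy:

```python
import numpy as np

# A qubit |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
a = b = 1 / np.sqrt(2)            # an equal superposition of 0 and 1
psi = np.array([a, b])
probs = np.abs(psi) ** 2          # Born rule: probabilities of measuring 0 or 1
print(probs)                      # [0.5 0.5]
```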

However, the expert said, the development of such devices raises a number of practical difficulties: as soon as the user reads the value of a qubit, it loses its quantum properties and turns into an ordinary bit with one fixed value. Therefore, the input data is written into a system of qubits, and the calculation proceeds without measuring their values. Once the values of the qubits are read, the calculation stops.

In addition, when creating a quantum computer it is necessary to take the phenomenon of quantum entanglement into account. This means that the qubits are in a mutually dependent state. For example, if measuring one qubit gives the value 1, then measuring the qubit associated with it will give 0. The main current technological problem is that systems of coupled qubits are very unstable and quickly accumulate errors. And the more qubits there are, the shorter the period of stable operation, said the expert.
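
The anti-correlation the expert describes matches a Bell state of two qubits. A small sketch of its outcome statistics, in the same amplitude-vector picture as above:

```python
import numpy as np

# Bell state (|01> + |10>) / sqrt(2): measuring one qubit as 1
# forces the other to 0, and vice versa.
psi = np.zeros(4)
psi[0b01] = psi[0b10] = 1 / np.sqrt(2)
probs = np.abs(psi) ** 2
for outcome, p in zip(["00", "01", "10", "11"], probs):
    print(outcome, p)             # only "01" and "10" occur, each with p = 0.5
```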

This feature explains why existing quantum computers have so few qubits. However, each year this number increases: for example, the first quantum computer, tested by IBM in 2001, had only seven qubits, while the Sycamore quantum processor presented by Google in 2019 has 53.

Indeed, over the last three years the number of qubits in quantum computers has doubled annually, so the technology holds great promise, says Sergey Chirkin, dean of the faculties of artificial intelligence and Big Data analytics at GeekUniversity, the educational portal of GeekBrains: In quantum computing, at least small annual breakthroughs are expected, which ultimately should lead to a significant part of the computation for artificial intelligence running on quantum computers. This could happen in the next ten years, provided that investment in research grows and the number of quantum computer developers increases. Technology of this level will likely be available everywhere: access to a quantum computer could be obtained as part of a cloud service.

The technology of the quantum computer will require improvement, says Ivan Atkin, co-founder of the company Crown and doctor of physical sciences. Separate scientific breakthroughs happen regularly, but this is just the beginning of the journey; scientists have only just realized that it is possible. In the next five years this quantum breakthrough will definitely not happen; the first successes will come only in ten years. By comparison, the technology of artificial intelligence was established more than 50 years ago, but its real use began only recently, and AI is a much simpler technology than a quantum computer, the scientist predicts.

The evolution of the computer demonstrates a pattern: a development that is a breakthrough for its time is first introduced in specialized areas and then receives wide domestic distribution. Thus, the first prototype computers, which appeared in the 1940s, were used exclusively for military and scientific purposes. They occupied entire rooms, weighed tens of tons and could carry out up to several thousand operations per second. The era of the personal computer began only in the 1980s, thanks to Steve Jobs' Mac. The device cost two and a half thousand dollars, weighed a little under ten kilograms and could be operated even by a child. By the time Jobs announced the death of the PC in 2010, it had become an integral part of most people's lives. Probably the same thing will eventually happen with the quantum computer. However, according to experts, the ordinary consumer should not expect personal quantum devices for at least the next ten years.

In the meantime, all work in this area is exploratory in character, and the results are needed first of all by the scientific and technological community, says Alexander Tormasov, rector of Innopolis University and advisor to the Russian Quantum Center. Industry is just beginning to take an interest in quantum technologies. At the Russian Quantum Center, we are discussing the creation of operating systems for working with quantum computers. Also, in the next ten years quantum devices for measuring time with high accuracy may reach the market. This will be the impetus for further technological development, as we will be able to build a satellite positioning system that improves the scale of recognition from 100 meters to 30 centimeters. Then we will be able to see a person's steps from space, concluded the expert.

Read the rest here:
Schrödinger's qubit: will a quantum computer replace traditional PCs? - The KXAN 36 News

14 Tech Pros Predict The ‘Next Big Thing’ In Cybersecurity And Encryption – Forbes

Cybersecurity is a constant arms race. Because of its continuous evolution, what firms have solved for today might be obsolete by tomorrow. But unfortunately, many media outlets don't focus on the technical innovations of the industry and prefer to look at the failures of cybersecurity.

Instead of covering how encryption technology has evolved, media outlets cover how hackers have bypassed security measures. As a result, people may be less aware of what the current and future trends of cybersecurity and encryption are. To help educate and inform others, 14 experts from Forbes Technology Council explore the latest innovations and trends coming soon in the world of cybersecurity and why they are important.

1. Security By Design

Over the first 20 years of the internet age, security and encryption have been add-on products on top of systems built without them. For the next 20 years we'll see security from the ground up go mainstream. An example in today's world is the Apple iOS for iPhone/iPad or Microsoft Windows 10 with BitLocker and Defender enabled. - Bret Piatt, Jungle Disk

2. Proxy Re-Encryption

Fueled by the rise of distributed applications, new advancements in encryption technologies allow private data to be stored on public, decentralized networks. Exciting developments in proxy re-encryption (PRE) make this possible, and usability is critical. Data owners can grant or revoke access to their encrypted data without having to worry about the complexities of encryption and key management. - Mark Pryor, The Seam

3. Secure Multiparty Computation

This is an exciting time for cryptography. Secure multiparty computation (MPC) replaces dated hardware, realizing operational agility and cost-effectiveness, and is a natural fit for cloud. MPC eliminates single points of failure, and is synergetic with cybersecurity technologies, improving authentication, insider threat mitigation and key management while driving innovation. - Yehuda Lindell, Unbound Tech

4. More Customization And Smarter Solutions

The next big thing in cybersecurity will be responsive and predictive technologies underpinning sector-specific, real-time defense systems. There will be a shift away from reliance on one-size-fits-all security services and toward more intelligent and informative cybersecurity solutions customized to better engage, protect and serve particular industry ecosystems. - Charles Aunger, Health2047 - American Medical Association

5. Artificial Intelligence-Powered Cybersecurity

Expect more AI in cybersecurity, from both the perspective of a hacker and from those trying to defend against attacks. Hackers will be able to infuse AI-powered malware, for example, to infiltrate networks and stay dormant until it finds the optimal time to deploy its payload. On the flipside, security tools will use AI to identify anomalies that we may currently miss. Who will win is anyone's guess. - Jason Lau, Crypto.com

6. A Predictive Model To Eliminate Threat Vectors

The current generation of cyber protection, which uses a solution divorced from the asset it is protecting so as to catch the predator in advance of the actual attack on protected resources, will continue to be enhanced. Next big thing? The introduction of artificial intelligence and data analytics into a predictive model that will determine threat vectors and shut them down before they even start. - Jerry Nelson, Beyond Impact

7. Increased Focus On Physical Security

The vast majority of breaches and hacks come from a failure to maintain good physical security. The bottom line is most hacks start with someone gaining access to credentials. A hacker will not be successful if you do not invite them in. Want to lessen your risk of intrusion? Get streetwise and make sure your staff is, as well. "Think before you click" is my motto. - Wayne Lonstein, VFT Solutions, Inc.

8. Integration Of Self-Contained Tools

There is a difference between cybersecurity and digital privacy. But the two need each other to survive. I think the next big thing will be twofold. First, expect self-contained communication tools, with privacy built in, to be integrated so that information is kept within company borders. The second will be the growing use of AI to fight the other AI that is and will be most responsible for data breaches. - Göran Wågström, Idka

9. Use Of Data Access Security Brokers

Organizations need to separate the tools used for access control, such as encryption, from the data by inserting a layer between them that functions as a data access security broker. The broker will validate policies and authorize the user/application/device/etc. to access the data, allowing organizations to retain control of data at all times. - Jeff Capone, SecureCircle

10. Enhanced Third-Party Vetting

At the heart of many breaches plaguing the news are missteps by the third parties many organizations work with. Third parties must be vetted carefully by an organization for their risk management and data protection policies and procedures, because they also use third parties, quickly spreading the original organization's data into a complex spider web. Proper vetting procedures can mitigate such a risk. - Matt Kunkel, LogicGate

11. Quantum-Resistant Cryptography

Encrypting information so that only those who should be able to access it can has been extremely valuable for cybersecurity, but with the coming of quantum computing, brute force attacks could become much more efficient and speedy. Quantum-resistant cryptography (QRC) algorithms will become increasingly important as quantum computing becomes more mainstream and reaches cyber adversaries. - Michael Xie, Fortinet

12. Homomorphic Security

In the ERP world, we battle cybersecurity every day, and for the massive amounts of data in our client databases we want an avenue that is more secure than any other. As such, we have begun working with homomorphic encryption. No key is stored anywhere for the data inside a database, so there are no keys for thieves to use to get at the data in any way. It would take two trillion years to break. - Christopher Carter, Approyo

13. Increased Transparency And Ease Of Use

The biggest problem I've had in my 20-year cybersecurity career is that users will find ways to circumvent security technology if it makes their respective jobs more tedious. Continuous training with the right tech leads to greater adoption and security. For example, with proper training, the Corcoran $400k email scam would've been prevented. - Tim Maliyil, AlertBoot

14. More Focus On Protecting People

The next era of cybersecurity will focus on protecting people, not just the networks and devices they use. Today, most data breaches are caused by human error. Businesses need a people-centric approach to cybersecurity, layering awareness training with advanced machine learning technology to understand human behavior online, and predict and prevent incidents of human error before they happen. - Edward Bishop, Tessian

Read this article:
14 Tech Pros Predict The 'Next Big Thing' In Cybersecurity And Encryption - Forbes