Archive for the ‘Quantum Computer’ Category

How Zapata and Andretti Motorsport Will Use Quantum Computing to Gain an Edge at the Indianapolis 500 – Quantum Computing Report

You might think that auto racing would not be a good application for quantum computing because the teams consist of grease monkeys who may know auto mechanics but wouldn't know how to leverage advanced computing. But you would be wrong.

Auto racing is a big business where there can be a very thin line between success and failure. To give you an idea of how small things can make a big difference, you can look at the results of the 2015 Indianapolis 500. In that race, the difference in finishing time between first-place finisher Juan Pablo Montoya and second-place finisher Will Power was 104.6 milliseconds. And those 104.6 milliseconds made the difference between winning a first-place prize of $2.44 million or not.

It turns out that an auto race generates a lot of data, about 1 terabyte per car in a typical race, that if analyzed and used wisely can help give a racing team a critical edge. To that end, Zapata Computing and Andretti Motorsport formed a partnership earlier this year to work together on race analytics and see how they could use Zapata's advanced analytics, quantum techniques, and Orquestra hybrid classical/quantum data and workflow manager to win more races.

Although this work between the two companies has just started, a big event for both companies will occur this weekend with the 2022 Indianapolis 500 race. We talked with Chris Savoie, CEO of Zapata Computing, and he described three of the first use cases where they believe advanced analytics, machine learning, and quantum computing can potentially make a difference.

Tire Degradation Analysis

When you have a car going at over 200 MPH, the tires wear out very quickly. In a typical Indianapolis 500, the tires may be changed five or more times, requiring time-wasting pit stops. What's more, the tires behave differently when freshly mounted than after they have been used a while. So, the racing manager has a lot of strategic variables to juggle: when should the car be called in for a pit stop to change the tires, which set of tires should go on the car, how many tire changes should there be, and what are the current weather and track conditions? For a data analyst, this is a large optimization problem, and it will be one of the first areas where Zapata works with Andretti to create an ML model that can help guide these decisions using data collected in previous race sessions as well as data collected in real time during the race.
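
To see why tire strategy is an optimization problem, consider a deliberately simplified sketch. All the constants below (base lap time, wear penalty, pit-stop cost) are invented for illustration, not Andretti's actual figures; the real model would be learned from telemetry rather than hand-coded.

```python
# Toy pit-stop optimizer: choose the number of tire changes that minimizes
# total race time, given a hypothetical lap-time penalty for tire age.
# Every constant here is an illustrative assumption.

LAPS = 200            # Indy 500 race length in laps
BASE_LAP = 40.0       # assumed base lap time, seconds
WEAR_PENALTY = 0.01   # assumed seconds lost per lap of tire age
PIT_COST = 35.0       # assumed time lost per pit stop, seconds

def race_time(num_stops: int) -> float:
    """Total race time if tire changes are spaced evenly across the race."""
    stint_laps = int(LAPS / (num_stops + 1))
    # Within a stint, tire age grows 0, 1, 2, ..., so the wear loss per
    # stint is WEAR_PENALTY * (0 + 1 + ... + (stint_laps - 1)).
    wear_loss = WEAR_PENALTY * stint_laps * (stint_laps - 1) / 2
    return LAPS * BASE_LAP + (num_stops + 1) * wear_loss + num_stops * PIT_COST

best = min(range(10), key=race_time)
print(best, round(race_time(best), 1))
```

Even this toy version shows the trade-off: too few stops and tire wear dominates, too many and pit time dominates. The real problem adds weather, track conditions, and live race data as variables, which is what makes it a candidate for machine learning.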

Fuel Savings Opportunities

Cars need to be refueled during the race. In addition, the driver has some control over fuel consumption by the way he drives. If a racing team can find a way to minimize the number of refuelings and avoid a pit stop, it can save a lot of time. What's more, you don't want to cross the finish line with a full tank, because that fuel would be wasted. In the 2016 race, driver Alexander Rossi took a gamble and decided to skip a final refueling pit stop with 33 laps to go. He ran out of gas at the very end and coasted across the finish line, but he won the race because the second-place finisher did decide to refuel, and the extra pit stop time cost him the race. So, finding ways to improve fuel consumption and determine the best timing for refueling also turns out to be an optimization problem that may be an opportunity to use machine learning and advanced analytics to find the best solution and improve race performance.

Yellow Flag Predictive Modelling

A yellow flag occurs when there is an accident or debris on the track. Drivers are required to reduce their speed, and passing another car is prohibited. One of the impacts of this is that the relative lead of one car over another is reduced. But it may also be a good time to go in for a pit stop, since the cars aren't going at full speed while the flag is out. If a racing team had a crystal ball and could predict when a yellow flag would occur, it could help them determine their best pit stop strategy. This may seem a little far-fetched, but the Zapata/Andretti team will attempt to create a model for this based upon conditions on the track, the status of the various cars in the race, which particular drivers are in those cars, and other factors collected during the race. It will be interesting to see whether they can actually create a useful model for when yellow flags may occur from this data.

From an operations standpoint, working in this environment presents some unique challenges. But it also provides learning opportunities for the Zapata team as they face real-world challenges and find ways to solve them that can be used for future product enhancements and customer engagements in other areas. One of the first things to understand is that the racing environment requires real-time decisions, and you do not want to rely on a quantum computer somewhere in the cloud on race day. The latencies would be too high, and you don't want to struggle with flaky Wi-Fi connections. So, Zapata and Andretti have set up an on-site Race Analytics Command Center, as shown in the picture below.

Zapata and Andretti aren't going to install a quantum computer in this trailer, but it will have a large amount of classical computing capability to help the team make real-time decisions on race day. Machine learning applications are typically divided into a training phase that develops the optimum coefficients for a model and an execution phase that just runs the model and provides an output based upon the previously computed coefficients. The training phase is the most computationally intensive part of an ML model; it does not have to run in real time, which makes it a good opportunity for leveraging quantum computing. Executing a model once it is created is not so computationally intensive and can be done on a classical processor. The team can feed in data from previous races and trial runs, create an ML model over many days or weeks, and then execute the ML model in real time on classical computers sitting in this trailer.
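
The train-offline, execute-in-real-time split described above can be sketched in a few lines. The data and the model here are invented for illustration (a linear fit of lap-time loss against tire age); the point is only that the expensive fitting happens before race day, while the race-day evaluation is a single multiply-add.

```python
# Sketch of the training/execution split: fit offline, evaluate in real time.
# The "historical" (tire_age_laps, seconds_lost_per_lap) pairs are invented.

history = [(0, 0.0), (10, 0.11), (20, 0.19), (30, 0.32), (40, 0.41)]

def train(data):
    """Offline phase: fit slope and intercept by ordinary least squares.
    This is the computationally heavy part that can run for days or weeks."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    slope = sum((x - mx) * (y - my) for x, y in data) / sum(
        (x - mx) ** 2 for x, _ in data
    )
    return slope, my - slope * mx

def predict(model, tire_age):
    """Race-day phase: a single multiply-add, cheap enough for real time."""
    slope, intercept = model
    return slope * tire_age + intercept

model = train(history)           # run before race day
loss_now = predict(model, 25)    # run trackside, in microseconds
```

Swap the least-squares fit for a quantum or quantum-inspired training routine and the race-day half of the picture stays exactly the same: classical machines in the trailer evaluating an already-trained model.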

The collaboration between Zapata and Andretti goes well beyond leveraging quantum computing. The overall program will involve working with multiple databases that could be resident with cloud providers, edge computing data coming in from various sensors, and managing workflows that are both classical and quantum in nature. Zapata will be using their Orquestra product to help manage all this.

This will be a long-term collaboration. Because the available quantum computers are not yet powerful enough to provide an advantage, the first implementations of this work will use quantum-inspired algorithms. However, the intent is that as quantum processors become more powerful, these algorithms will eventually be moved to full quantum computers, allowing the companies to create larger, more complex, and more accurate models to further their advantage. Andretti participates in many different types of auto racing and has many different teams. So, the two companies will have a lot of opportunities to try out and develop this capability. We also expect the companies will find additional use cases for leveraging advanced computing capabilities as they work together.

For additional information about this collaboration, a news release posted on the Zapata web site can be accessed here.

May 26, 2022


QuantWare and QuantrolOx Partner to Ease the Integration of the Control Software With the Hardware Device – Quantum Computing Report

QuantWare is a company that provides QPU chips to customers who want to build their own quantum computers. QuantrolOx is a company that provides automated, machine learning based control software for optimum control of qubits. The two companies have announced a partnership to integrate QuantrolOx's software with QuantWare's hardware to create an open-architecture quantum computer solution for customers who want to build their own machine. QuantWare asserts that by working with it and its partners, a customer can create a quantum computer on their own for 1/10th the cost of purchasing a complete system from one of the hardware vendors. Additional information about this partnership can be found in a news release posted on the QuantWare website here.



Q&A with Atos’ Eric Eppe, an HPCwire Person to Watch in 2022 – HPCwire

HPCwire presents our interview with Eric Eppe, head of portfolio & solutions, HPC & Quantum at Atos, and an HPCwire 2022 Person to Watch. In this exclusive Q&A, Eppe recounts Atos' major milestones from the past year and previews what's in store for the year ahead. Exascale computing, quantum hybridization and decarbonization are focus areas for the company, and having won five out of the seven EuroHPC system contracts, Atos is playing a big role in Europe's sovereign technology plans. Eppe also shares his views on HPC trends (what's going well and what needs to change) and offers advice for the next generation of HPC professionals.

Eric, congratulations on your selection as a 2022 HPCwire Person to Watch. Summarize the major milestones achieved last year for Atos in your division and briefly outline your HPC/AI/quantum agenda for 2022.

2021 was a strong year for Atos' Big Data and Security teams, despite the pandemic. The Atos BullSequana XH2000 was in its third year and was already exceeding all sales expectations. More than 100,000 top-bin AMD CPUs were sold on this platform, and it made one of the first entries for AMD Epyc in the Top500.

We not only won five out of seven EuroHPC petascale projects, but also delivered some of the most significant HPC systems. For example, we delivered one of the largest climate study and weather forecast systems in the world to the European Centre for Medium-Range Weather Forecasts (ECMWF). In addition, Atos delivered a full BullSequana XH2000 cluster to the German climate research center (DKRZ). 2021 also saw the launch of Atos ThinkAI and the delivery of a number of very large AI systems, such as WASP in Sweden.

2022 is the year in which we are preparing the future with our next-gen Atos BullSequana XH3000 supercomputer, a hybrid computing platform bringing together flexibility, performance and energy-efficiency. Announced recently in Paris, this goes along with the work that has started on hybrid computing frameworks to integrate AI and quantum accelerations with supercomputing workflows.

Sovereignty and sustainability were key themes at Atos' launch of its exascale supercomputing architecture, the BullSequana XH3000. Please address in a couple of paragraphs how Atos views these areas and why they are important.

This was a key point I mentioned during the supercomputer's reveal. For Europe, the real question is: should we indefinitely rely on foreign technologies to find new vaccines, develop autonomous electric vehicles, and find strategies to face climate change?

The paradox is that Europe leads the semiconductor substrate and manufacturing markets (with Soitec and ASML) but has no European foundry in the <10nm class yet. It is participating in the European Processor Initiative (EPI) and will implement SiPearl technologies in the BullSequana XH3000, but it will take time to mature enough and replace other technologies.

Atos has built a full HPC business in less than 15 years, becoming number one in Europe and in the top four worldwide in the supercomputer segment, with its entire production localized in its French factory. We are heavily involved in all projects that are improving European sovereignty.

EU authorities today stand a bit behind the USA and China in how regulations manage large petascale or exascale procurements, as well as in how funding flows to local companies developing HPC technologies. This is a major topic.

Atos has developed a significant amount of IP, ranging from supercomputing platforms, low-latency networks, cooling technologies, software and AI, to security and large manufacturing capabilities in France, with sustainability and sovereignty as a guideline. We are partnering with a number of European companies, such as SiPearl, IQM, Pasqal, AQT, Graphcore, ARM, OVH and many labs, to continue building this European sovereignty.

Atos has announced its intention to develop and support quantum accelerators. What is Atos' quantum computing strategy?

Atos has taken a hardware-agnostic approach in crafting quantum-powered supercomputers and enabling end-user applications. Atos' ambition is to be a major player in multiple domains, amongst which are quantum programming and simulation, next-generation quantum-powered supercomputers, consulting services, and of course, quantum-safe cybersecurity.

Atos launched the Atos Quantum Learning Machine (QLM) in 2017, a quantum appliance emulating almost all target quantum processing units, with abstractions to connect to real quantum computing hardware when available. We have been very successful with the QLM in large academic and research centers on all continents. In 2021, there was a shift, with many commercial companies starting to work on real use cases, and the QLM is the best platform to start these projects without waiting for hardware to be available at scale.

Atos plays a central role in European-funded quantum computing projects. We are cooperating with NISQ QPU makers to develop new technologies and increase their effectiveness in hybrid computing scenarios. This includes, but is not limited to, hybrid frameworks, containerization, parallelization, VQE, GPU usage and more.

Where do you see HPC headed? What trends and in particular emerging trends do you find most notable? Any areas you are concerned about, or identify as in need of more attention/investment?

As for upcoming trends in the world of supercomputing, I see a few low-noise trends: some technological barriers that may trigger drastic changes, and some arising technologies that may have large impacts on how we do HPC in the future. Most players, and Atos more specifically, are looking into quantum hybridization and decarbonization, which will open many doors in the near future.

Up to this point, the HPC environment has been quite conservative. I believe that administrators are starting to see the benefits of orchestration and microservice-based cluster management. There are some obstacles, but I see more merits than issues in containerizing and orchestrating HPC workloads. There are some rising technological barriers that may push our industry into a corner, while at the same time giving us opportunities to change the way we architect our systems.

High-performance, low-latency networks make massive use of copper cables. With higher data rates (400 Gb/s in 2022 and 800 Gb/s in 2025), the workable copper cable length will be divided by four, replaced by active or fiber cables, with cabling costs certainly increasing by five or six times. This is clearly an obstacle for systems in the range of 25,000 endpoints, with a cabling budget in the tens of millions.

This very simple problem may impose a paradigm shift in the way devices, from a general standpoint, are connected and communicate together. It triggers deeper architectural design-point changes, from racks to nodes and down to elements that are deeply integrated today, such as compute cores, buses, memory and associated controllers, and switches. I won't say the 800 Gb/s step alone will change everything, but the maturity of some technologies, such as silicon photonics, and the emerging standardization of very powerful protocols like CXL will enable a lot more flexibility while continuing to push the limits. Also, note that CXL is just in its infancy but already shows promise for a memory-coherent space between heterogeneous devices, centralized or distributed, mono- or multi-tenant memory pools.

Silicon photonic integrated circuits (PICs), because they theoretically offer Tb/s bandwidth through native fiber connections, should allow a real disaggregation between devices that are today very tightly connected together on ever more complex and expensive PCBs.

What will be possible inside a node will be possible outside of it, blurring the traditional frontier between a node, a blade, a rack and a supercomputer, offering a world of possibilities and new architectures.

The market is probably not fully interested in finding an alternative to the ultra-dominance of Linpack, or in its impact on how we imagine, engineer, size and deliver our supercomputers. Ultimately, how relevant is its associated ranking to real-life problems? I wish we could initiate a trend that ranks global system efficiency versus available peak power. This would help HPC players consider working on all optimization paths rather than piling on more and more compute power.

Lastly, I am concerned by the fact that almost nothing has changed in the last 30 years in how applications interact with data. Well, HPC certainly uses faster devices. We now have clustered shared file systems like Lustre. Also, we have invented object-oriented key-value abstractions, but in reality storage subsystems are most of the time centralized. They are connected to the high-speed fabric. They are also oversized to absorb checkpoints from an ever-growing node count, while in the nominal regime they use only a portion of the available bandwidth. Ultimately, with workloads by nature spread across the whole fabric, most of the power consumption comes from I/Os.

However, it's time to change this situation. There are some possible avenues, and they will improve, as a side effect, the global efficiency of HPC workloads, and hence the sustainability and the value of HPC solutions.

More generally, what excites you about working in high-performance computing?

I've always loved to learn and be intellectually stimulated, especially in my career environment. High-performance computing, along with AI and now quantum, gives me constant food for thought and more options to solve big problems than I will ever be able to absorb.

I appreciate pushing the limits every day, driving the Atos portfolio and setting the directions, ultimately helping our customers solve their toughest problems. This is really rewarding for me and our Atos team. I'm never satisfied, but I'm very proud of what we have achieved together, bringing Atos into the top four worldwide in supercomputers.

What led you to pursue a career in the computing field and what are your suggestions for engaging the next generation of IT professionals?

I've always been interested in technology, initially attracted by everything that either flew or sailed. Really, I'm summarizing this as everything that plays with wind. In my teenage years, after experiencing sailboards and gliders, I was fortunate enough to have access to my first computer in late 1979, when I was 16. My field of vision prevented me from becoming a commercial pilot, so I pursued a software engineering master's degree that led me into the information technology world.

When I began my career in IT, I was not planning any specific path to a specific domain. I simply took all opportunities to learn a new domain, worked hard to succeed, and jumped to something new that excited me. In my first position, I was lucky enough to work on an IBM mainframe doing CAD with some software development, as well as embracing a fully unknown system engineering role that I had to learn from scratch. Very educational! I jumped between developing in Fortran and doing system engineering on VM/SP and Unix. Then I learned the Oracle RDBMS and the Internet at Intergraph, and HPC servers and storage at SGI. I pursued my own startups, and now I'm leading the HPC, AI and quantum portfolio at Atos.

What I would tell the next generation of IT professionals about their careers is:

First, only take roles in which you will learn new things. It could be managerial, financial, technical; it doesn't matter. To evolve in your future career, the more diverse experience you have, the better you will be able to react and be effective. Move to another role when you are no longer learning or if you have been too long in your comfort zone.

Second, look at problems to solve, think out of the box and with a 360-degree vision. Break the barriers, and change the angle of view to give new perspectives and solutions to your management and customers.

Also, compensation is important, but it's not everything. What you will do, how it will make you happy in your life, and what you will achieve professionally are more important. Ultimately, compare your salary with the free time that remains to spend with your family and friends. Lastly, compensation is not always an indicator of success; changing the world for the better and making our planet a better place to live is the most important benefit you will find in high-performance computing.

Outside of the professional sphere, what can you tell us about yourself family stories, unique hobbies, favorite places, etc.? Is there anything about you your colleagues might be surprised to learn?

Together with my wife, we are the proud parents of two beautiful adult daughters. We also have our three-year-old bombshell of a Jack Russell named Pepsy, who brings a lot of energy to our house.

We live northwest of Paris, in a small city on the Seine river. I'm still a private pilot and still cruise sailboats with family and friends. I recently participated in the ARC 2021 transatlantic race with three friends on a trimaran, a real challenge and a great experience. Soon, we're off to visit Scotland for a family vacation!

Eppe is one of 12 HPCwire People to Watch for 2022. You can read the interviews with the other honorees at this link.


Encryption: How It Works, Types, and the Quantum Future – eSecurity Planet

Encryption and the development of cryptography have been a cornerstone of IT security for decades and remain critical for data protection against evolving threats.

While cryptology is thousands of years old, modern cryptography took off in the 1970s with the help of the Diffie-Hellman-Merkle and RSA encryption algorithms. As networks evolved and organizations adopted internet communications for critical business processes, these cryptographic systems became essential for protecting data.

Through public and commercial development of advanced encryption methods, organizations from sensitive government agencies to enterprise companies can ensure protected communications between personnel, devices, and global offices. Financial institutions in the 1990s and 2000s were some of the first to incorporate encryption to protect online transactions, particularly as backup tapes were lost in transit.

The race continues for cryptographers to keep encryption systems ahead of cryptanalysts and hackers. Quantum computing attacks already present a real threat to existing standards, making the continued development of encryption pivotal for years to come.

This article looks at encryption, how it fits into cryptology, how cryptographic algorithms work, types, use cases, and more.

Encryption is the act of translating data into secret code (ciphertext) and back again (plaintext) for secure access between multiple parties. With shared protocols and encryption algorithms, users can encode files or messages only accessible to other select clients.

To no one's surprise, the study of cryptography and advancements in encryption are essential to developing cybersecurity. Individuals, small businesses, and enterprise organizations all rely on encryption to securely store and transfer sensitive data across wide-area networks (WANs) like the internet.

Application developers managing sensitive user data must especially beware of increasing regulatory action surrounding data privacy.

Cryptology is the overarching field of study related to writing and solving codes, whereas encryption and decryption are the central processes driving the computer science discipline.

As seen below, cryptography is the methodology and applications for managing encryption schemes, and cryptanalysis is the methodology of testing and decrypting these messages.

Cryptographers versed in the latest encryption methods help cybersecurity companies, software developers, and national security agencies secure assets. Cryptanalysts are the individuals and groups responsible for breaking encryption algorithms for good, bad, and ugly reasons.

Penetration testers and red teams are critical for remaining vigilant in an ever-changing threat environment and catching the vulnerabilities otherwise missed. Alternatively, advanced persistent threats (APTs) are always around the corner trying to do the same.

While there are several encryption schemes, they all share the ability to encrypt and decrypt data through a cryptographic key. This unique key is a random string produced specifically to complete the encryption transaction, and the longer the key and the more complex the process, the better.

Brute force attacks are among the most common cryptanalytic methods, and the time it takes to break an encrypted message is a recognized indicator of the encryption strength.

For users familiar with password management and the value of complex passwords, this makes sense. The longer and more complex the key, the longer it'll take to decrypt the message without it.
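
The arithmetic behind that intuition is simple: each additional key bit doubles the number of keys a brute-force attacker must try. A short sketch, assuming a hypothetical machine that tests one trillion keys per second (the rate is an invented figure, not a benchmark):

```python
# Why key length matters: each extra bit doubles the keyspace.
RATE = 1e12  # assumed keys tested per second, for illustration only

def years_to_search(bits: int) -> float:
    """Worst-case years to try every key of the given length."""
    return (2 ** bits) / RATE / (3600 * 24 * 365)

# A 56-bit DES-era key falls in well under a year at this rate,
# while a 128-bit key takes longer than the age of the universe.
print(f"56-bit:  {years_to_search(56):.5f} years")
print(f"128-bit: {years_to_search(128):.3e} years")
```

This is why the industry moved from 56-bit DES to 128-bit and 256-bit AES keys: the brute-force cost grows exponentially, not linearly, with key length.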

Without encryption, data from users and organizations alike would be widely available for all to see on public networks. Individuals and application developers hold responsibility for using and implementing services secured by a good encryption algorithm.

Not every application or network requires military-grade encryption; however, enterprise organizations can't go wrong with the services offering the most strength.

A visible example of the role encryption plays with everyday web traffic is the transition from HTTP to HTTPS protocols witnessed in the last decade. Short for the Hypertext Transfer Protocol, HTTP was central to the World Wide Web development in the 1990s and remains a popular application layer protocol connecting users to internet content through a web browser.

In 1994, Secure Sockets Layer (SSL) emerged to give clients an encrypted method to surf the web. By 1999, its successor, the Transport Layer Security (TLS) protocol, offered a more robust cryptographic protocol across technical components like cipher suites, the record protocol, message authentication, and the handshake process. HTTP over SSL or HTTP over TLS, dubbed HTTPS, wasn't immediately adopted by the masses.

Thanks to an industry campaign led by the Electronic Frontier Foundation (EFF) for users, website owners, and hosting providers to prioritize secure web traffic, HTTPS has overcome its less secure older sibling. In 2016, only 40% of websites protected their web pages and visiting users with HTTPS. Five years later, that number is more than 90% of websites, protecting users en masse from web attacks.

Before computer science, two individuals could use an identical key to unlock a shared mailbox or gate. Today, symmetric encryption via block ciphers or stream ciphers works much the same way, offering two or more users the ability to encrypt and decrypt messages with a single, shared key between stakeholders.

Users can establish a symmetric key to share private messages through a secure channel like a password manager. Unfortunately, while symmetric encryption is a faster method, it also is less secure.

Symmetric models rely on the integrity of the private key, and sharing it in plaintext over text or email leaves users vulnerable. Phishing and social engineering are common ways threat actors can obtain a symmetric key, but cryptanalysis and brute force attempts can also break symmetric key ciphers.
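The defining property of symmetric encryption is that one shared key performs both operations. The sketch below illustrates that property with a toy keystream cipher built from a hash function; it is for intuition only and is not a secure construction (real systems use vetted ciphers like AES).

```python
# Toy symmetric cipher: derive a keystream from the shared key and XOR it
# with the message. Decryption is the SAME operation with the SAME key.
# Illustrative only -- not a secure, production cipher.
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Stretch the shared key into a pseudorandom byte stream."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

shared_key = b"correct horse battery staple"
message = b"attack at dawn"
ct = xor_cipher(shared_key, message)   # encrypt
pt = xor_cipher(shared_key, ct)        # decrypt: same key, same function
```

Note how the whole scheme collapses if `shared_key` leaks, which is exactly the key-distribution weakness the article describes and the problem asymmetric cryptography was invented to solve.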

In the 1970s, the demand for more secure cryptographic systems was met with computer scientists from Stanford and MIT developing the first examples of asymmetric encryption.

Unlike symmetric cryptography, asymmetric encryption is a complex mathematical process in which two users exchange public and private components to create a shared, unique key. Though more complicated and expensive to implement, asymmetric encryption uses thousands of bits and a robust key generation process to ensure secure communications over distributed networks.

Software developers and organizations increasingly use symmetric and asymmetric encryption methods to give users speed and security in communication.

Also known as hybrid encryption, the bundle of the two methods usually starts with a handshake between users through asymmetric cryptography to establish security. Within the asymmetric connection, parties then use symmetric algorithms for the faster processing of messages.

Cryptography challenges have been met by leading computer scientists, universities, and national security and intelligence agencies. The below section looks at the most substantial standards in the evolution of encryption.

The need for a government-wide standard to encrypt sensitive information became evident in 1973, when the U.S. National Bureau of Standards, now NIST, made a public request for potential ciphers. The winning algorithm, dubbed the Data Encryption Standard (DES), was developed and proposed by IBM and lead cryptographer Horst Feistel.

By the 1990s, DES was widely criticized for its vulnerability to brute force attacks and its short key size. Triple DES, wherein the DES cipher algorithm runs over each data block three times, proved more secure but still insufficient for the coming online ecosystem and universe of data.

Shortly after the release of DES, three computer scientists, Whitfield Diffie, Martin Hellman, and Ralph Merkle, published their research on public-private key cryptography in 1976. As it came to be known, the Diffie-Hellman-Merkle (DHM) key exchange set a precedent for asymmetric encryption before the global networking boom.

Unlike symmetric encryption methods, which previously used few bits, the DHM key exchange provided for encryption supporting key lengths of 2,048 bits to 4,096 bits.
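The mechanics of the DHM exchange fit in a few lines. The sketch below uses a deliberately tiny prime so the arithmetic is visible; real deployments use moduli of 2,048 bits or more, which is what makes recovering the private exponents infeasible.

```python
# Toy Diffie-Hellman-Merkle key exchange. The prime is deliberately tiny
# for illustration; security comes from using 2048-bit-plus moduli.

p, g = 23, 5                      # public: a prime modulus and a generator

a_secret, b_secret = 6, 15        # each party's PRIVATE exponent
A = pow(g, a_secret, p)           # Alice publishes g^a mod p
B = pow(g, b_secret, p)           # Bob publishes g^b mod p

alice_key = pow(B, a_secret, p)   # Alice computes (g^b)^a mod p
bob_key = pow(A, b_secret, p)     # Bob computes (g^a)^b mod p
assert alice_key == bob_key       # same shared secret, never transmitted
```

An eavesdropper sees only `p`, `g`, `A`, and `B`; recovering either private exponent from those values is the discrete logarithm problem, which is intractable at real key sizes on classical hardware.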

A year after DHM's findings, three cryptographers, Ron Rivest, Adi Shamir, and Leonard Adleman, developed the RSA public-key cryptosystem.

The three innovators and MIT patented the RSA algorithm, a proprietary system available through RSA Security until its public release in 2000. Standing the test of time, the RSA algorithm remains the most popular public key cryptographic system today and introduced the concept of digital signatures for authentication.
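Textbook RSA is compact enough to sketch with the classic small-prime example (real keys use primes hundreds of digits long, and real implementations add padding):

```python
# Textbook RSA with tiny primes, showing the public/private key mechanics.
# Illustrative only: real RSA uses 2048-bit-plus keys and padding schemes.

p, q = 61, 53
n = p * q                       # public modulus, 3233
phi = (p - 1) * (q - 1)         # 3120
e = 17                          # public exponent, coprime with phi
d = pow(e, -1, phi)             # private exponent: modular inverse of e

message = 65
ciphertext = pow(message, e, n)     # anyone can encrypt with the PUBLIC key
recovered = pow(ciphertext, d, n)   # only the PRIVATE key holder can decrypt
assert recovered == message
```

The security rests on the difficulty of factoring `n` back into `p` and `q`; signing works by running the same exponentiations in the opposite order, which is how RSA introduced digital signatures.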

In 1997, NIST renewed its call to the public cryptography community for a successor to DES. Two Belgian cryptographers, Joan Daemen and Vincent Rijmen, submitted the eventual pick, known as Rijndael. By 2001, NIST had dubbed it the Advanced Encryption Standard (AES) and officially replaced the use of DES.

AES offered larger and different key sizes with a family of ciphers to choose from and remains one of the most popular standards over 20 years later.

While both DES and AES use symmetric block ciphers, AES uses a substitution-permutation network wherein plaintext goes through multiple rounds of substitution (S-box) and permutation (P-box) before finalizing the ciphertext block. Similarly, a client or application can decrypt the AES message by reversing these S-box and P-box transformations.
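A substitution-permutation round can be illustrated on a single byte. The S-box and P-box below are invented toys (AES's real S-box and MixColumns step are far more involved); the point is only that each stage is invertible, so decryption reverses the same transformations.

```python
# Toy substitution-permutation round: substitute via a lookup table
# (S-box), then permute the bits (P-box); invert both steps to decrypt.
# The tables here are invented for illustration, not AES's real ones.

SBOX = [(x * 7 + 3) % 256 for x in range(256)]  # invertible since gcd(7,256)=1
INV_SBOX = [0] * 256
for i, v in enumerate(SBOX):
    INV_SBOX[v] = i

def permute(byte: int) -> int:      # P-box: rotate bits left by 3
    return ((byte << 3) | (byte >> 5)) & 0xFF

def unpermute(byte: int) -> int:    # inverse P-box: rotate right by 3
    return ((byte >> 3) | (byte << 5)) & 0xFF

def encrypt_round(byte: int) -> int:
    return permute(SBOX[byte])

def decrypt_round(byte: int) -> int:
    return INV_SBOX[unpermute(byte)]

# Every byte round-trips through the forward and inverse transformations.
assert all(decrypt_round(encrypt_round(b)) == b for b in range(256))
```

AES chains many such rounds, each mixed with key material, so that the combined substitution and permutation diffuse every plaintext bit across the whole ciphertext block.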

Neal Koblitz of the University of Washington and Victor Miller of IBM independently published research on elliptic curve cryptography (ECC) in 1985, but it didn't come into widespread implementation until the mid-2000s.

Like RSA, ECC is a public-key algorithm, but instead of deriving key pairs from large prime numbers, ECC uses points on elliptic curves. ECC is faster than RSA and achieves comparable security with much smaller keys, resting on the hardness of the discrete logarithm problem for elliptic curves over finite fields.
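The group law behind ECC can be sketched on a textbook curve. The example below uses y² = x³ + 2x + 2 over the 17-element field, far too small to be secure, purely to show the point addition and double-and-add scalar multiplication that production curves like P-256 perform over enormous fields.

```python
# Toy elliptic-curve arithmetic and ECDH on a textbook curve (insecure; demo only).
P_MOD, A = 17, 2    # curve y^2 = x^3 + 2x + 2 over the field of 17 elements
G = (5, 1)          # generator point on the curve

def point_add(P, Q):
    # Group law; None represents the point at infinity (the identity).
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                                   # P + (-P) = infinity
    if P == Q:                                        # tangent-line doubling
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:                                             # chord through P and Q
        s = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (s * s - x1 - x2) % P_MOD
    return (x3, (s * (x1 - x3) - y1) % P_MOD)

def scalar_mult(k, P):
    # Double-and-add: compute k*P in O(log k) group operations.
    R = None
    while k:
        if k & 1:
            R = point_add(R, P)
        P = point_add(P, P)
        k >>= 1
    return R

# ECDH in miniature: both parties arrive at the same shared point.
a, b = 3, 7   # toy private keys
assert scalar_mult(a, scalar_mult(b, G)) == scalar_mult(b, scalar_mult(a, G))
```

The security of real ECC comes from the infeasibility of recovering k from k*G when the field and the private scalars are hundreds of bits wide.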

ECC has proven to be a popular choice for web applications, blockchains, and mobile devices as a fast, lightweight, yet secure alternative to RSA. Still, ECC isn't immune to compromise, with known threats including twist-security and side-channel attacks.

In 1978, Rivest and Adleman published additional research on a cryptographic method dubbed homomorphic encryption. However, it wasn't until 2009 that Craig Gentry, then a graduate student, published research on fully homomorphic encryption (FHE) and set off a period of exploration.

Unlike conventional cryptography, homomorphic encryption allows a limited set of operations on ciphertext without decrypting the message. Homomorphic models include partially homomorphic encryption (PHE), which supports a single type of operation such as addition or multiplication; somewhat homomorphic encryption (SHE), which supports a limited mix of operations; and FHE, which offers the broadest operational control over encrypted data.
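A concrete taste of the partially homomorphic case: textbook (unpadded) RSA is multiplicatively homomorphic, meaning the product of two ciphertexts decrypts to the product of the plaintexts. The parameters below are tiny and insecure, chosen only to make the property visible.

```python
# Multiplicative homomorphism of textbook RSA (insecure toy parameters).
p, q = 101, 103
n = p * q                    # public modulus
phi = (p - 1) * (q - 1)
e = 7                        # public exponent, coprime to phi
d = pow(e, -1, phi)          # private exponent (Python 3.8+)

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

# Multiply the ciphertexts only; the individual messages are never decrypted.
m1, m2 = 7, 12
c_product = (enc(m1) * enc(m2)) % n
assert dec(c_product) == (m1 * m2) % n   # decrypts straight to the product
```

FHE generalizes this idea from one fixed operation to arbitrary computation over encrypted data, which is what makes it so much harder to build efficiently.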

More than a decade later, companies like Google, IBM, and Microsoft continue to explore FHE capabilities that would let an organization process specific data within an encrypted message while maintaining the integrity of the data. FHE remains a maturing cryptographic system with little evidence to date of widespread adoption.

Based on quantum mechanics rather than classical mathematical operations, a sufficiently large and fault-tolerant quantum computer running Shor's algorithm for finding prime factors could break asymmetric standards like DHM, RSA, and ECC.

Post-quantum cryptography (PQC) describes the budding market working to address quantum attacks and secure the next generation of IT environments and data. Agencies like NIST and the NSA continue to release security guidelines against quantum threats, but there's still much to learn about quantum information science (QIS), and there is no official U.S. standard yet.

Earlier this month, the White House released a national security memo outlining U.S. administrative objectives for adopting quantum-resistant cryptography. While initial standards are expected by 2024, a full mitigation architecture for federal agencies isn't expected until 2035.

The most common applications for cryptographic systems in IT environments include:

Cryptology long predates today's encryption algorithms for data stored in our pockets and moving across the web. From Julius Caesar to the Enigma code, cryptographic methods continue to become more complex, to the benefit and detriment of various actors.

As cryptanalysts and threat actors poke holes in the latest implementations, it's natural for the industry and users to upgrade to the most robust available algorithms. Outdated and inadequate cryptographic standards leave organizations and users vulnerable, giving those persistent or capable enough the ability to extract, sell, or ransom sensitive data.

The emergence of post-quantum cryptography is a reality stakeholders must grapple with sooner rather than later.

Read more:
Encryption: How It Works, Types, and the Quantum Future | eSP - eSecurity Planet

COMPUTEX 2022 Returns to In-Person With Virtual and Physical Exhibition – HPCwire

TAIPEI, May 24, 2022. Today marked the opening of COMPUTEX 2022, held until May 27 at Taipei Nangang Exhibition Center, Hall 1. Among the distinguished guests in attendance at the opening ceremony to witness the rapid development of new digital technology were President Tsai Ing-wen, Minister of Economic Affairs Wang Mei-Hua, Chairman of the Taiwan External Trade Development Council (TAITRA) James Huang, and Chairman of the Taipei Computer Association Paul Peng.

President Tsai Ing-wen stated, "COMPUTEX is an important platform for the global technology industry, which not only enables Taiwanese companies to strengthen their international collaboration and connect to the global market, but also shows the capabilities of Taiwan's ICT industry to the world. In the future, the development of advanced technologies such as AI, quantum computers, and cloud computing will be highly dependent on chips. Therefore, Taiwan will leverage its strengths in high-end hardware manufacturing and empowering ICT innovations in various industries to make the overall economy more competitive. Also, we will actively work together with enterprises to accelerate the digital transformation process and to build the next golden decade of Taiwan's technology industry."

"Over the past twenty years, technology, our shared global language, has empowered the world and resulted in important milestones. Even when facing urgent challenges such as the pandemic and supply chain disruptions, technology has allowed infinite possibilities," said TAITRA Chairman James Huang. "COMPUTEX's mission has always been to introduce technologies to the world and help make a difference, and this year's event offers an upgraded, hybrid exhibition experience. We look forward to stimulating technological innovation and heading into the future with global technology companies."

Leading global ICT companies showcased their innovative technologies and solutions at COMPUTEX. GIGABYTE showcased high-performance computing applications, including AI, 5G, edge computing, intelligent traffic management, security, and gaming and entertainment. Delta Electronics chose to focus on sustainability and presented energy and thermal management solutions for applications such as industrial automation, data center infrastructure, and EV charging. KIOXIA displayed its XG8 series of client SSDs for high-end notebooks, desktops, and workstations. Furthermore, the Garage+ Pavilion selected 48 startups to showcase innovative capabilities in numerous fields, including AI, IoT, health care, and green technology.

COMPUTEX 2022 Provides an Overview of the Global Technology Ecosystems

COMPUTEX 2022 features six main themes: Accelerating Intelligence, Connected X-Experience, Digital Resilience, Innovative Computing, Innovations & Startups, and Sustainability. In addition, a virtual exhibition, COMPUTEX DigitalGO, is being held from today to June 6. By making use of diverse channels, COMPUTEX 2022 has created an interactive platform for global engagement and provided a comprehensive overview of the future developments in the global technology ecosystems.

In addition to the comprehensive exhibition, COMPUTEX also offers keynote speeches and forums. In this year's CEO Keynotes, Advanced Micro Devices, Inc. (AMD), NXP Semiconductors (NXP), Micron Technology, and Supermicro will share their corporate visions from a technology perspective. Microsoft and NVIDIA will also give keynote speeches, streamed live on COMPUTEX's YouTube channel.

The COMPUTEX Forum will be held on May 26 at Taipei Nangang Exhibition Center, Hall 1, Section J. In the morning, in the first session, titled "Technology Empowerment," Texas Instruments, Ericsson, NXP, NVIDIA, and Micron Technology will discuss how global technology giants find partners, achieve new advancements, and embrace change.

In the afternoon, in the second session, Delta Electronics will speak on "Unceasing Innovation for a Net Zero Future" and demystify how businesses are leveraging digital technology to achieve sustainability and reach the 2050 net-zero carbon emissions target. Finally, in the third and final session, themed "Application Advancements," HTC, IBM, Dassault Systèmes, and Nokia Taiwan will discuss the metaverse and how businesses can actively deploy smart living and successfully create new work modes.

Furthermore, Live Studio, a new addition to this year's event, will serve as the official news channel for COMPUTEX 2022 and provide participants with the most up-to-date and complete event coverage throughout the show. The Guided Tours are another highlight of the event. Industry KOLs will personally lead the tours, take fans around the booths, and put a brand new spin on technology discovery. In addition, media outlets, including Embedded Computing Design from the US, Dempa Publications from Japan, and IT Chosun from South Korea, will cover COMPUTEX, showing Taiwan's scientific and technological achievements and potential to the world.

This year, COMPUTEX 2022 is being held at Taipei Nangang Exhibition Center, Hall 1. In addition to technology trend sharing, industry application demonstrations, and the fun and interactive live studio and guided tours, there are photo booths for each of the six themes. Participants who take photos in each booth and upload the photos will be entered to win the event organizer's limited-edition COMPUTEX 2022 NFT. With so many exciting activities to enjoy, COMPUTEX 2022 is an event not to be missed.

To learn more about COMPUTEX, please visit: https://www.computextaipei.com.tw.

About COMPUTEX

COMPUTEX was founded in 1981. It has grown with the global ICT industry and become stronger over the last four decades. Bearing witness to historical moments in the development of and changes in the industry, COMPUTEX attracts more than 40,000 buyers to visit Taiwan every year. It is also the preferred platform chosen by top international companies for launching epoch-making products.

Taiwan has a comprehensive global ICT industry chain. Gaining a foothold in Taiwan, COMPUTEX is jointly held by the Taiwan External Trade Development Council and the Taipei Computer Association, aiming to build a global tech ecosystem. COMPUTEX uses cross-domain integration and innovation services as the most powerful driving forces for achieving the goal of becoming a new platform for global technological resources.

About TAITRA

The Taiwan External Trade Development Council (TAITRA) is Taiwan's foremost trade promotion organization. TAITRA is a public-benefit corporation founded by the Ministry of Economic Affairs by uniting industry and commerce groups from the private sector with the purpose of helping them expand their global reach. Currently, TAITRA has a team of more than 1,300 trade professionals, both domestically and abroad. Headquartered in Taipei, TAITRA operates 5 local offices in Taoyuan, Hsinchu, Taichung, Tainan, and Kaohsiung, as well as 63 branches worldwide. It has also signed cooperation agreements with 319 sister organizations that promote international trade. By forming a comprehensive trade services network that provides zero-time-difference and borderless real-time services, TAITRA continues to work with enterprises to jointly pursue the steady development of Taiwan's economy. It is the best partner for your success in business expansion.

Source: COMPUTEX

Read the original:
COMPUTEX 2022 Returns to In-Person With Virtual and Physical Exhibition - HPCwire