Encryption: How It Works, Types, and the Quantum Future

Encryption and the development of cryptography have been a cornerstone of IT security for decades and remain critical for data protection against evolving threats.

While cryptology is thousands of years old, modern cryptography took off in the 1970s with the Diffie-Hellman-Merkle key exchange and the RSA algorithm. As networks evolved and organizations adopted internet communications for critical business processes, these cryptographic systems became essential for protecting data.

Through public and commercial development of advanced encryption methods, organizations from sensitive government agencies to enterprise companies can ensure protected communications between personnel, devices, and global offices. Financial institutions in the 1990s and 2000s were some of the first to adopt encryption, protecting online transactions and, after high-profile losses of backup tapes in transit, data at rest as well.

The race continues for cryptographers to keep encryption systems ahead of cryptanalysts and hackers. The prospect of quantum computing attacks already presents a real threat to existing standards, making the continued development of encryption pivotal for years to come.

This article looks at encryption, how it fits into cryptology, how cryptographic algorithms work, types, use cases, and more.

Encryption is the act of translating data into secret code (ciphertext) and back again (plaintext) for secure access between multiple parties. With shared protocols and encryption algorithms, users can encode files or messages accessible only to select parties.
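
As a toy illustration of that translation, the sketch below encodes a message with a repeating XOR key and then reverses the process. It is deliberately insecure and meant only to show the plaintext-to-ciphertext round trip.

```python
# Toy plaintext/ciphertext round trip with a repeating XOR key.
# NOT a secure cipher -- purely an illustration of encoding data
# with a shared secret and reversing the process.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

plaintext = b"meet me at noon"
key = b"secret"

ciphertext = xor_cipher(plaintext, key)  # encrypt
recovered = xor_cipher(ciphertext, key)  # decrypt (XOR is its own inverse)

assert recovered == plaintext
print(ciphertext.hex())
```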

To no one's surprise, the study of cryptography and advancements in encryption are essential to developing cybersecurity. Individuals, small businesses, and enterprise organizations all rely on encryption to securely store and transfer sensitive data across wide-area networks (WANs) like the internet.

Application developers managing sensitive user data must be especially mindful of increasing regulatory action surrounding data privacy.

Cryptology is the overarching field of study related to writing and solving codes, whereas encryption and decryption are the central processes driving the computer science discipline.

Within cryptology, cryptography is the methodology and applications for managing encryption schemes, and cryptanalysis is the methodology of testing and decrypting these messages.

Cryptographers versed in the latest encryption methods help cybersecurity companies, software developers, and national security agencies secure assets. Cryptanalysts are the individuals and groups responsible for breaking encryption algorithms for good, bad, and ugly reasons.

Penetration testers and red teamers are critical for remaining vigilant in an ever-changing threat environment and catching the vulnerabilities otherwise missed. Meanwhile, advanced persistent threats (APTs) are always around the corner trying to do the same.

While there are several encryption schemes, they all share the ability to encrypt and decrypt data through a cryptographic key. This unique key is a random string produced specifically for the encryption transaction; the more bits in the key and the more complex the process, the stronger the encryption.

Brute force attacks are among the most common cryptanalytic methods, and the time it takes to break an encrypted message is a recognized indicator of the encryption strength.

For users familiar with password management and the value of complex passwords, this makes sense: the longer and more complex the key, the longer a brute force attack will take.
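
A back-of-the-envelope sketch makes the point: every added key bit doubles the keyspace a brute force attacker must search. The guess rate below is an arbitrary assumption chosen for illustration.

```python
# Keyspace math: each extra bit doubles the number of possible keys.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365
GUESSES_PER_SECOND = 1e12  # hypothetical attacker speed (assumption)

for bits in (56, 128, 256):  # DES-, AES-128-, and AES-256-sized keys
    keyspace = 2 ** bits
    years = keyspace / GUESSES_PER_SECOND / SECONDS_PER_YEAR
    print(f"{bits:3d}-bit key: {keyspace:.2e} keys, ~{years:.2e} years to exhaust")
```

Even at a trillion guesses per second, exhausting a 128-bit keyspace would take on the order of 10^19 years.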

Without encryption, data from users and organizations alike would be widely available for all to see on public networks. Individuals and application developers hold responsibility for using and implementing services secured by a good encryption algorithm.

Not every application or network requires military-grade encryption; however, enterprise organizations can't go wrong with the services offering the most strength.

A visible example of the role encryption plays in everyday web traffic is the transition from HTTP to HTTPS protocols witnessed in the last decade. Short for Hypertext Transfer Protocol, HTTP was central to the development of the World Wide Web in the 1990s and remains a popular application layer protocol connecting users to internet content through a web browser.

In 1994, Secure Sockets Layer (SSL) emerged to give clients an encrypted method to surf the web. By 1999, its successor, the Transport Layer Security (TLS) protocol, offered more robust cryptography across technical components like cipher suites, the record protocol, message authentication, and the handshake process. HTTP over SSL or TLS, dubbed HTTPS, wasn't immediately adopted by the masses.
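
To see what a modern connection negotiates, the sketch below uses only Python's standard library to open a TLS session and report the protocol version and cipher suite. The hostname is a placeholder; substitute any HTTPS site.

```python
import socket
import ssl

hostname = "www.example.com"  # placeholder host
context = ssl.create_default_context()  # validates certificates by default

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        print("Protocol:", tls.version())        # e.g. 'TLSv1.3'
        print("Cipher suite:", tls.cipher()[0])  # negotiated cipher
```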

Thanks to an industry campaign led by the Electronic Frontier Foundation (EFF) for users, website owners, and hosting providers to prioritize secure web traffic, HTTPS has overtaken its less secure older sibling. In 2016, only 40% of websites protected their web pages and visiting users with HTTPS; five years later, more than 90% of websites do, protecting users en masse from web attacks.

Before computer science, two individuals could use an identical key to unlock a shared mailbox or gate. Today, symmetric encryption via block ciphers or stream ciphers works much the same way, offering two or more users the ability to encrypt and decrypt messages with a single, shared key between stakeholders.

Users can establish a symmetric key to share private messages through a secure channel, like a password manager. Unfortunately, while symmetric encryption is a faster method, it is also less secure.
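
A minimal symmetric sketch, assuming the third-party `cryptography` package is installed (pip install cryptography): Fernet wraps AES with an HMAC, so a single shared key both encrypts and authenticates messages.

```python
from cryptography.fernet import Fernet

shared_key = Fernet.generate_key()  # must reach the peer over a secure channel
f = Fernet(shared_key)

token = f.encrypt(b"quarterly numbers attached")
print(f.decrypt(token))  # anyone holding shared_key can read the message
```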

Symmetric models rely on the integrity of the private key, and sharing it in plaintext over text or email leaves users vulnerable. Phishing and social engineering are common ways threat actors can obtain a symmetric key, but cryptanalysis and brute force attempts can also break symmetric key ciphers.

In the 1970s, the demand for more secure cryptographic systems was met by computer scientists at Stanford and MIT, who developed the first examples of asymmetric encryption.

Unlike symmetric cryptography, asymmetric encryption is a complex mathematical process in which two users exchange public and private components to create a shared, unique key. Though more complicated and expensive to implement, asymmetric encryption uses keys thousands of bits long and a robust key generation process to ensure secure communications over distributed networks.

Software developers and organizations increasingly combine symmetric and asymmetric encryption methods to give users both speed and security in communication.

Also known as hybrid encryption, this bundling of the two methods usually starts with a handshake between users through asymmetric cryptography to establish security. Within the asymmetric connection, parties then use symmetric algorithms for the faster processing of messages.
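
The sketch below models that hybrid pattern with the `cryptography` package, standing in RSA key wrapping for the asymmetric handshake and Fernet for the symmetric fast path; the messages are illustrative.

```python
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Receiver publishes an RSA public key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()
oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Sender: create a one-time symmetric key, wrap it asymmetrically,
# then encrypt the bulk message symmetrically.
session_key = Fernet.generate_key()
wrapped_key = public_key.encrypt(session_key, oaep)
ciphertext = Fernet(session_key).encrypt(b"bulk data takes the fast path")

# Receiver: unwrap the session key, then decrypt the symmetric traffic.
recovered_key = private_key.decrypt(wrapped_key, oaep)
print(Fernet(recovered_key).decrypt(ciphertext))
```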

Cryptography challenges have been met by leading computer scientists, universities, and national security and intelligence agencies. The section below looks at the most substantial standards in the evolution of encryption.

The need for a government-wide standard to encrypt sensitive information was evident in 1973, when the U.S. National Bureau of Standards, now the National Institute of Standards and Technology (NIST), made a public request for potential ciphers. The winning algorithm, dubbed the Data Encryption Standard (DES), was developed and proposed by IBM and lead cryptographer Horst Feistel.

By the 1990s, DES had received wide criticism for its short key size and vulnerability to brute force attacks. Triple DES, wherein the DES cipher algorithm runs over data blocks three times, proved more secure but still insufficient for the coming online ecosystem and its universe of data.

Around the time DES was being standardized, three computer scientists, Whitfield Diffie, Martin Hellman, and Ralph Merkle, published their research on public-private key cryptography in 1976. The Diffie-Hellman-Merkle (DHM) key exchange, as it came to be known, set a precedent for asymmetric encryption before the global networking boom.

Unlike earlier symmetric methods, which used relatively short keys, the DHM key exchange supports key lengths of 2,048 to 4,096 bits in modern implementations.
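
A minimal sketch of the exchange using the `cryptography` package: both parties derive an identical secret from each other's public values, and the secret itself never crosses the wire. Fresh parameter generation is slow, so real deployments typically reuse standardized groups.

```python
from cryptography.hazmat.primitives.asymmetric import dh

parameters = dh.generate_parameters(generator=2, key_size=2048)  # slow step

alice_private = parameters.generate_private_key()
bob_private = parameters.generate_private_key()

# Each side combines its own private key with the other's public key.
alice_secret = alice_private.exchange(bob_private.public_key())
bob_secret = bob_private.exchange(alice_private.public_key())

assert alice_secret == bob_secret  # same secret, never transmitted
```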

A year after DHM's findings, three cryptographers, Ron Rivest, Adi Shamir, and Leonard Adleman, developed the RSA public key cryptosystem.

The three innovators and MIT patented the RSA algorithm, a proprietary system available through RSA Security until the patent expired in 2000. Standing the test of time, RSA remains one of the most popular public key cryptographic systems today and introduced the concept of digital signatures for authentication.
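
The sketch below, again assuming the `cryptography` package, shows the digital signature concept in action: sign with the private key, verify with the matching public key.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
message = b"wire $10,000 to account 42"
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)

signature = private_key.sign(message, pss, hashes.SHA256())

# verify() raises InvalidSignature if the message or signature changed.
private_key.public_key().verify(signature, message, pss, hashes.SHA256())
print("signature verified")
```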

In 1997, NIST renewed its call to the public cryptography community for a successor to DES. Two Belgian cryptographers, Joan Daemen and Vincent Rijmen, submitted the eventual pick, known as Rijndael. By 2001, NIST had dubbed it the Advanced Encryption Standard (AES) and officially replaced DES.

AES offered larger and different key sizes with a family of ciphers to choose from and remains one of the most popular standards over 20 years later.

While both DES and AES use symmetric block ciphers, AES uses a substitution-permutation network wherein plaintext goes through multiple rounds of substitution (S-box) and permutation (P-box) before finalizing the ciphertext block. Similarly, a client or application can decrypt the AES message by reversing these S-box and P-box transformations.
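
A minimal AES sketch via the `cryptography` package's AESGCM interface; GCM is one member of the AES family of modes, and it authenticates the ciphertext in addition to encrypting it.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # AES-256
nonce = os.urandom(12)                     # unique per message, never reused

aesgcm = AESGCM(key)
ciphertext = aesgcm.encrypt(nonce, b"attack at dawn", None)  # no associated data
print(aesgcm.decrypt(nonce, ciphertext, None))
```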

Neal Koblitz of the University of Washington and Victor Miller of IBM independently published research on elliptic curve cryptography (ECC) in 1985, but it didn't come into widespread implementation until the mid-2000s.

Like RSA, ECC is an encryption algorithm for public key cryptography, but instead of prime numbers for generating key pairs, ECC uses elliptic curves. ECC is faster than RSA and uses smaller keys while maintaining security through the mathematics of elliptic curves over finite fields.

ECC has proven a popular choice for web applications, blockchains, and mobile devices as a fast, lightweight, yet secure alternative to RSA. Still, ECC isn't immune to compromise, facing threats like twist attacks and side-channel attacks.
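
The sketch below runs an ECDH key exchange on NIST curve P-256 with the `cryptography` package; note the 256-bit keys versus the 2,048 or more bits RSA needs for comparable strength.

```python
from cryptography.hazmat.primitives.asymmetric import ec

alice_private = ec.generate_private_key(ec.SECP256R1())  # P-256 curve
bob_private = ec.generate_private_key(ec.SECP256R1())

# Each side derives the same shared secret from the other's public key.
alice_secret = alice_private.exchange(ec.ECDH(), bob_private.public_key())
bob_secret = bob_private.exchange(ec.ECDH(), alice_private.public_key())

assert alice_secret == bob_secret
```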

In 1978, Rivest and Adleman, together with Michael Dertouzos, published additional research on a cryptographic method dubbed homomorphic encryption. However, it wasn't until 2009 that a graduate student, Craig Gentry, published research on fully homomorphic encryption (FHE) and set off an exploration period.

Unlike conventional cryptography, homomorphic encryption allows a set of limited operations on ciphertext without decrypting the message. Homomorphic models include partially homomorphic encryption (PHE) for a single operation type, somewhat homomorphic encryption (SHE) for a limited mix of operations, and FHE for the broadest operational control over encrypted data.
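
As a toy example of PHE, textbook RSA is multiplicatively homomorphic: the product of two ciphertexts decrypts to the product of the plaintexts. The sketch below uses tiny, insecure parameters purely to make the arithmetic visible.

```python
# Textbook RSA with toy parameters (insecure; for illustration only).
p, q = 61, 53
n = p * q   # 3233
e = 17      # public exponent
d = 413     # private exponent: 17 * 413 = 7021 ≡ 1 (mod lcm(60, 52) = 780)

encrypt = lambda m: pow(m, e, n)
decrypt = lambda c: pow(c, d, n)

c1, c2 = encrypt(6), encrypt(7)
product_ciphertext = (c1 * c2) % n  # multiply without ever decrypting

print(decrypt(product_ciphertext))  # 42 == 6 * 7
```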

More than a decade later, companies like Google, IBM, and Microsoft continue to explore FHE capabilities, through which an organization can process specific data within an encrypted message while maintaining the integrity of the data. FHE remains a maturing cryptographic system with little evidence to date of widespread adoption.

Because quantum computers rest on quantum mechanics rather than classical binary operations, a sufficiently powerful quantum computer running Shor's algorithm for finding prime factors could break asymmetric standards like DHM, RSA, and ECC in short order.

Post-quantum cryptography (PQC) describes the budding market working to address quantum attacks and secure the next generation of IT environments and data. Agencies like NIST and the NSA continue to release security guidelines against quantum threats, but there's still much to learn about quantum information science (QIS), and no official U.S. standard exists yet.

In May 2022, the White House released a national security memorandum outlining U.S. administrative objectives for adopting quantum-resistant cryptography. While initial standards are expected by 2024, a full mitigation architecture for federal agencies isn't expected until 2035.

The most common applications for cryptographic systems in IT environments include encrypting web traffic with HTTPS/TLS, protecting data at rest and in transit, authenticating users and messages with digital signatures, and securing technologies like VPNs, password managers, blockchains, and mobile devices.

Cryptology long predates today's encryption algorithms for data stored in our pockets and moving across the web. From Julius Caesar to the Enigma code, cryptographic methods continue to become more complex, to the benefit and detriment of various actors.

As cryptanalysts and threat actors poke holes in the latest implementations, it's natural for the industry and users to upgrade to the most robust available algorithms. Outdated and inadequate cryptographic standards leave organizations and users vulnerable, giving those persistent or capable enough the ability to extract, sell, or ransom sensitive data.

The emergence of post-quantum cryptography is a reality stakeholders must grapple with sooner rather than later.
