Cryptography network security book


 

Cryptography and Network Security: Principles and Practices, Fourth Edition. No part of this book may be reproduced, in any form or by any means, without permission. OBJECTIVES: It is the purpose of this book to provide a practical survey of both the principles and practice of cryptography and network security. Intended for one-semester, undergraduate- or graduate-level courses in Cryptography, Computer Security, and Network Security, the book is also suitable for self-study.




NETWORK SECURITY ESSENTIALS, FOURTH EDITION: a tutorial and survey of network security technology; the book covers important network security tools. Another text elaborates the basic and advanced concepts of cryptography and network security issues; it is user friendly, since each chapter follows a consistent model. Cryptography and Network Security itself has drawn reviews from a large community of readers and targets one-semester undergraduate and graduate courses.

Open access peer-reviewed edited volume: Applied Cryptography and Network Security, edited by Jaydip Sen, Praxis Business School. Cryptography will continue to play an important role in developing new security solutions, which will be in great demand with the advent of high-speed next-generation communication systems and networks. This book discusses some of the critical security challenges faced by today's computing world and provides insights into possible mechanisms to defend against these attacks. The book contains sixteen chapters, which deal with security and privacy issues in computing and communication networks, quantum cryptography, and evolutionary concepts of cryptography and their applications, such as chaos-based cryptography and DNA cryptography. It will be useful for researchers, engineers, and graduate and doctoral students working in cryptography and security-related areas, as well as for faculty members of graduate schools and universities.

One layer will try to find features in the raw data of a picture that will help find a face, such as changes in color that will indicate an edge. The next layer might try to combine these lower layers into features like shapes, looking for round shapes inside of ovals that indicate eyes on a face. The different layers will try different features and will be compared by the evaluation function until the one that is able to give the best results is found, in a process that is only slightly more refined than trial and error.

Large data sets are essential to making this work, but that doesn't mean that more data is automatically better or that the system with the most data is automatically the best system.

Train a facial recognition algorithm on a set that contains only faces of white men, and the algorithm will have trouble with any other kind of face. Use an evaluation function that is based on historical decisions, and any past bias is learned by the algorithm. For example, mortgage loan algorithms trained on the historic decisions of human loan officers have been found to implement redlining. Similarly, hiring algorithms trained on historical data often manifest the same sexism as the human staff whose decisions they learned from.

Scientists are constantly learning about how to train machine learning systems, and while throwing a large amount of data and computing power at the problem can work, more subtle techniques are often more successful. All data isn't created equal, and for effective machine learning, data has to be both relevant and diverse in the right ways. Future research advances in machine learning are focused on two areas. The first is in enhancing how these systems distinguish between variations of an algorithm.

As different versions of an algorithm are run over the training data, there needs to be some way of deciding which version is "better." Getting functions that can automatically and accurately distinguish between two algorithms based on minor differences in their outputs is an art form that no amount of data can improve.

The second is in the machine learning algorithms themselves.

While much of machine learning depends on trying different variations of an algorithm on large amounts of data to see which is most successful, the initial formulation of the algorithm is still vitally important.

The way the algorithms interact, the types of variations attempted, and the mechanisms used to test and redirect the algorithms are all areas of active research. Overviews of some of this work have been published, though even trying to limit such a survey to 20 papers oversimplifies the work being done in the field.

None of these problems can be solved by throwing more data at them. DeepMind's AlphaGo computer program became a grandmaster in two steps. First, it was fed an enormous number of human-played games. Then, it played against itself an enormous number of times, improving its own play along the way. In 2016, AlphaGo beat the grandmaster Lee Sedol four games to one. While the training data in this case, the human-played games, was valuable, even more important were the machine learning algorithm used and the function that evaluated the relative merits of different game positions.

Just one year later, DeepMind was back with a follow-on system: AlphaZero. This go-playing computer dispensed entirely with the human-played games and just learned by playing against itself over and over again. It plays like an alien.


This means it must be shown that no efficient method (as opposed to the time-consuming brute-force method) can be found to break the cipher. Since no such proof has been found to date, the one-time pad remains the only theoretically unbreakable cipher. There are a wide variety of cryptanalytic attacks, and they can be classified in any of several ways. A common distinction turns on what Eve (an attacker) knows and what capabilities are available.

In a ciphertext-only attack, Eve has access only to the ciphertext (good modern cryptosystems are usually effectively immune to ciphertext-only attacks). In a known-plaintext attack, Eve has access to a ciphertext and its corresponding plaintext (or to many such pairs). In a chosen-plaintext attack, Eve may choose a plaintext and learn its corresponding ciphertext (perhaps many times); an example is gardening, used by the British during WWII.

In a chosen-ciphertext attack, Eve may be able to choose ciphertexts and learn their corresponding plaintexts. For example, a simple brute-force attack against DES requires one known plaintext and 2^55 decryptions, trying approximately half of the possible keys, to reach a point at which chances are better than even that the key sought will have been found. But this may not be enough assurance; a linear cryptanalysis attack against DES requires 2^43 known plaintexts with their corresponding ciphertexts and approximately 2^43 DES operations.
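To make the brute-force idea concrete, here is a hypothetical toy sketch in Python (the shift cipher and all names are illustrative, not from the text): with one known plaintext/ciphertext pair, the attacker simply tries every key until a trial encryption matches.

```python
# Illustrative toy cipher and key space; a real attack targets e.g. DES's 2^56 keys.
def shift_encrypt(plaintext, key):
    """Encrypt lowercase letters with a shift (Caesar) cipher."""
    return "".join(chr((ord(c) - ord("a") + key) % 26 + ord("a")) for c in plaintext)

def brute_force(known_plain, known_cipher):
    """Try every key until the trial encryption matches the known pair."""
    for key in range(26):  # the shift cipher's entire key space
        if shift_encrypt(known_plain, key) == known_cipher:
            return key
    return None

ciphertext = shift_encrypt("attackatdawn", 7)
print(brute_force("attackatdawn", ciphertext))  # recovers key 7
```

The same search over a 56-bit key space is what makes the 2^55-decryption average for DES meaningful.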

Public-key algorithms are based on the computational difficulty of various problems. The most famous of these are the difficulty of integer factorization of semiprimes and the difficulty of calculating discrete logarithms , both of which are not yet proven to be solvable in polynomial time using only a classical Turing-complete computer. Much public-key cryptanalysis concerns designing algorithms in P that can solve these problems, or using other technologies, such as quantum computers.

For instance, the best known algorithms for solving the elliptic curve-based version of discrete logarithm are much more time-consuming than the best known algorithms for factoring, at least for problems of more or less equivalent size.

Thus, other things being equal, to achieve an equivalent strength of attack resistance, factoring-based encryption techniques must use larger keys than elliptic curve techniques. For this reason, public-key cryptosystems based on elliptic curves have become popular since their invention in the mid-1980s.

While pure cryptanalysis uses weaknesses in the algorithms themselves, other attacks on cryptosystems are based on actual use of the algorithms in real devices, and are called side-channel attacks. If a cryptanalyst has access to, for example, the amount of time the device took to encrypt a number of plaintexts or report an error in a password or PIN character, he may be able to use a timing attack to break a cipher that is otherwise resistant to analysis.
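As a rough illustration of the timing side channel (a minimal sketch, not from the text): a naive comparison that exits at the first mismatched byte leaks, through its running time, how much of a guess is correct. This is why constant-time comparisons such as Python's `hmac.compare_digest` exist.

```python
import hmac

def naive_equal(a, b):
    """Early-exit comparison: its running time leaks how many leading
    bytes of the guess are correct -- the basis of a timing attack."""
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False  # returns sooner for worse guesses
    return True

secret = b"hunter2!"
# A constant-time comparison inspects every byte regardless of mismatches,
# denying the attacker a timing signal:
print(hmac.compare_digest(secret, b"hunter2!"))  # True
print(hmac.compare_digest(secret, b"guess123"))  # False
```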

An attacker might also study the pattern and length of messages to derive valuable information; this is known as traffic analysis [52] and can be quite useful to an alert adversary. Poor administration of a cryptosystem, such as permitting keys that are too short, will make any system vulnerable, regardless of other virtues. Social engineering and other attacks against the humans who use or administer a cryptosystem (e.g., bribery or coercion) are often the most productive attacks of all.

Cryptographic primitives

Much of the theoretical work in cryptography concerns cryptographic primitives (algorithms with basic cryptographic properties) and their relationship to other cryptographic problems.

More complicated cryptographic tools are then built from these basic primitives. These primitives provide fundamental properties, which are used to develop more complex tools called cryptosystems or cryptographic protocols, which guarantee one or more high-level security properties. Note, however, that the distinction between cryptographic primitives and cryptosystems is quite arbitrary; for example, the RSA algorithm is sometimes considered a cryptosystem and sometimes a primitive.

Typical examples of cryptographic primitives include pseudorandom functions, one-way functions, etc.

Cryptosystems

One or more cryptographic primitives are often used to develop a more complex algorithm, called a cryptographic system, or cryptosystem.

Cryptosystems (e.g., El-Gamal encryption) are designed to provide particular functionality (e.g., public-key encryption) while guaranteeing certain security properties. Cryptosystems use the properties of the underlying cryptographic primitives to support the system's security properties.

As the distinction between primitives and cryptosystems is somewhat arbitrary, a sophisticated cryptosystem can be derived from a combination of several more primitive cryptosystems. In many cases, the cryptosystem's structure involves back-and-forth communication among two or more parties in space (e.g., between the sender and receiver of a secure message) or across time (e.g., cryptographically protected backup data). Such cryptosystems are sometimes called cryptographic protocols.

More complex cryptosystems include electronic cash [53] systems, signcryption systems, etc. Some more 'theoretical' cryptosystems include interactive proof systems, [54] like zero-knowledge proofs, [55] and systems for secret sharing. [56] [57]

Legal issues

See also: Cryptography laws in different nations

Prohibitions

Cryptography has long been of interest to intelligence gathering and law enforcement agencies.

Because of its facilitation of privacy , and the diminution of privacy attendant on its prohibition, cryptography is also of considerable interest to civil rights supporters. Accordingly, there has been a history of controversial legal issues surrounding cryptography, especially since the advent of inexpensive computers has made widespread access to high quality cryptography possible.

In some countries, even the domestic use of cryptography is, or has been, restricted. Until 1999, France significantly restricted the use of cryptography domestically, though it has since relaxed many of these rules. In China and Iran, a license is still required to use cryptography.


Probably because of the importance of cryptanalysis in World War II and an expectation that cryptography would continue to be important for national security, many Western governments have, at some point, strictly regulated export of cryptography. After World War II, it was illegal in the US to sell or distribute encryption technology overseas; in fact, encryption was designated as auxiliary military equipment and put on the United States Munitions List.

However, as the Internet grew and computers became more widely available, high-quality encryption techniques became well known around the globe.

Export controls

Main article: Export of cryptography

In the 1990s, there were several challenges to US export regulation of cryptography. Daniel J. Bernstein, then a graduate student at UC Berkeley, brought a lawsuit against the US government challenging some aspects of the restrictions on free-speech grounds.

The case Bernstein v. United States ultimately resulted in a decision that printed source code for cryptographic algorithms and systems was protected as free speech by the United States Constitution. Separately, the Wassenaar Arrangement, an arms-control treaty covering the export of arms and dual-use technologies such as cryptography, stipulated that the use of cryptography with short key lengths (56-bit for symmetric encryption, 512-bit for RSA) would no longer be export-controlled. Since this relaxation in US export restrictions, and because most personal computers connected to the Internet include US-sourced web browsers such as Firefox or Internet Explorer, almost every Internet user worldwide has potential access to quality cryptography via their browsers (e.g., via Transport Layer Security).

Many Internet users don't realize that their basic application software contains such extensive cryptosystems.

These browsers and email programs are so ubiquitous that even governments whose intent is to regulate civilian use of cryptography generally don't find it practical to do much to control distribution or use of cryptography of this quality, so even when such laws are in force, actual enforcement is often effectively impossible.

The technique (differential cryptanalysis) became publicly known only when Biham and Shamir re-discovered and announced it some years later. The entire affair illustrates the difficulty of determining what resources and knowledge an attacker might actually have.

Another instance of the NSA's involvement was the Clipper chip affair, an encryption microchip intended to be part of the Capstone cryptography-control initiative. Clipper was widely criticized by cryptographers for two reasons: the cipher algorithm (called Skipjack) was classified and so could not be publicly reviewed, and the scheme included a key escrow arrangement giving the government access to keys.

Sun Tzu wrote, "Of all those in the army close to the commander none is more intimate than the secret agent; of all rewards none more liberal than those given to secret agents; of all matters none is more confidential than those relating to secret operations."

Secret agents, field commanders, and other human elements of war required information. Keeping the information they shared from the enemy helped ensure advantages of maneuver, timing, and surprise. The only sure way to keep information secret was to hide its meaning.

Early cryptographers used three methods to encrypt information: substitution, transposition, and codes.

Monoalphabetic Substitution Ciphers

One of the earliest encryption methods is the shift cipher. A cipher is a method, or algorithm, that converts plaintext to ciphertext. See Figure 7-1 (Monoalphabetic Substitution Shift Cipher). The name of this cipher is intimidating, but it is simple to understand.


Monoalphabetic means it uses one cipher alphabet. Each character in the cipher alphabet—traditionally depicted in uppercase—is substituted for one character in the plaintext message.

Plaintext is traditionally written in lowercase.

It is a shift cipher because we shift the start of the cipher alphabet some number of letters (four in our example) into the plaintext alphabet. This type of cipher is simple to use and simple to break.

In Figure 7-1, we begin by writing our plaintext message without spaces. Including spaces is allowed, but it helps with cryptanalysis (cipher-breaking), as shown later.

We then substitute each character in the plaintext with its corresponding character in the ciphertext. Our ciphertext is highlighted at the bottom.
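The shift-cipher walkthrough above can be sketched in Python. This is a minimal illustration assuming the chapter's four-position shift, with ciphertext in uppercase and plaintext in lowercase, as the text describes; the message is hypothetical.

```python
import string

SHIFT = 4  # the four-position shift from the chapter's example

def encrypt(plaintext):
    """Substitute each plaintext letter (lowercase, spaces dropped) with the
    cipher-alphabet letter four positions along, written in uppercase."""
    out = []
    for c in plaintext.lower():
        if c in string.ascii_lowercase:
            out.append(string.ascii_uppercase[(ord(c) - ord("a") + SHIFT) % 26])
    return "".join(out)

def decrypt(ciphertext):
    """Reverse the shift, returning lowercase plaintext (without spaces)."""
    return "".join(
        string.ascii_lowercase[(ord(c) - ord("A") - SHIFT) % 26] for c in ciphertext
    )

ct = encrypt("meet me at dawn")
print(ct)           # QIIXQIEXHEAR
print(decrypt(ct))  # meetmeatdawn
```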

Breaking monoalphabetic substitution ciphers

Looking at the ciphertext, one of the problems with monoalphabetic ciphers is apparent: patterns. Repeated letters in the plaintext produce repeated characters in the ciphertext; one telling pattern is whether a letter is used as a double consonant or a double vowel. According to Mayzner and Tresselt, some doubled letters appear far more often than others in English.

According to Zim, the letters of the alphabet appear with a characteristic, diminishing frequency. Once the secret of frequency analysis spread, simple substitution ciphers were no longer safe. The steps are as follows. If you know the language of the plaintext hidden by the ciphertext, obtain a page-length sample of any text written in that language. Count the occurrence of all letters in the sample text and record the results in a table. Then count the occurrence of all cipher-alphabet characters in the ciphertext.

Start with the most frequently occurring letter in the plaintext sample and substitute it for the most common character in the ciphertext. Do this for the second most common character, the third, and so on. Eventually, this frequency analysis begins to reveal patterns and possible words. Remember that letters occur only with relative frequency, so this is not perfect. Letter frequency, for example, differs between writers and subjects. Consequently, using a general letter-frequency chart provides varying results depending on writing style and content.
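The frequency-analysis steps above can be sketched as follows. This minimal Python illustration assumes the plaintext is English and the cipher is a simple shift, so matching the most common ciphertext character to plaintext 'e' recovers the key; the sample message is hypothetical.

```python
from collections import Counter

def frequency_rank(text):
    """Count ciphertext letters and return them most-common-first
    (the counting steps of the procedure above)."""
    counts = Counter(c for c in text.lower() if c.isalpha())
    return [c for c, _ in counts.most_common()]

def guess_shift(ciphertext):
    """Guess a shift cipher's key by assuming the most frequent
    ciphertext character stands for plaintext 'e'."""
    top = frequency_rank(ciphertext)[0]
    return (ord(top) - ord("e")) % 26

message = "the secret meeting happens at the old granary near the east gate"
shifted = "".join(
    chr((ord(c) - 97 + 3) % 26 + 97) if c.isalpha() else c for c in message
)
print(guess_shift(shifted))  # 3, because 'e' dominates the plaintext
```

On short or unusual texts the guess can miss, which is exactly the "relative frequency" caveat in the text.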

However, by combining letter socialization characteristics with frequency analysis, we can work through inconsistency hurdles and arrive at the hidden plaintext. Summarizing, monoalphabetic substitution ciphers are susceptible to frequency and pattern analysis.

This is one of the key takeaways from this chapter: a bad cipher hides plaintext behind ciphertext that still contains recognizable patterns or regularly repeating character combinations.

Polyalphabetic Substitution Ciphers

Once al-Kindi broke monoalphabetic ciphers, cryptographers went to work trying to find a stronger cipher. Finally, in the 16th century, a French diplomat, Blaise de Vigenère, developed a cipher that would stand for many decades (Singh). The Vigenère table consists of 27 rows. The first row of lowercase letters represents the plaintext characters. Each subsequent row represents a cipher alphabet.

For each alphabet, the first character is shifted one position farther than in the previous row. In the first column, each row is labeled with a letter of the alphabet. Write the key above the message so that each letter of the key corresponds to one letter in the message; repeat the key as many times as necessary to cover the entire message. Identify the rows in the table corresponding to the letters in the key. Each of these rows represents a cipher alphabet we use to encrypt our message.

Replace each letter in the message with its corresponding ciphertext character. Anyone with the key and the layout of the table can decrypt the message.
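The keyed, repeating cipher alphabets described above can be sketched in Python. This is an illustrative Vigenère implementation for lowercase letters only; the key and message are hypothetical.

```python
def vigenere_encrypt(message, key):
    """Shift each plaintext letter by the amount given by the corresponding
    (repeated) key letter -- one cipher alphabet per key letter."""
    return "".join(
        chr((ord(c) - 97 + ord(key[i % len(key)]) - 97) % 26 + 97)
        for i, c in enumerate(message)
    )

def vigenere_decrypt(ciphertext, key):
    """Reverse the per-letter shifts using the same repeated key."""
    return "".join(
        chr((ord(c) - ord(key[i % len(key)])) % 26 + 97)
        for i, c in enumerate(ciphertext)
    )

ct = vigenere_encrypt("attackatdawn", "lemon")
print(ct)                             # lxfopvefrnhr
print(vigenere_decrypt(ct, "lemon"))  # attackatdawn
```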

However, it is still vulnerable to attack. Charles Babbage and Friedrich Wilhelm Kasiski demonstrated in the mid- and late 1800s, respectively, that even polyalphabetic ciphers provide trails for cryptanalysts. Although frequency analysis did not work, encrypted messages contained patterns that matched plaintext language behaviors. Once again, a strong cipher fell because it could not distance itself from the characteristics of the plaintext language.

Transposition Ciphers

Other attempts to hide the meaning of messages included rearranging letters to obfuscate the plaintext: transposition. The rail fence transposition is a simple example of this technique: the plaintext is written in a zigzag across two lines, and to create the ciphertext, the letters on the first line are written first and then the letters on the second.
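The rail fence transposition just described can be sketched in Python; a minimal illustration with two rails by default (the generalization to more rails follows the same zigzag idea).

```python
def rail_fence_encrypt(plaintext, rails=2):
    """Write the message across the rails in a zigzag, then read each
    rail off in order, top line first."""
    rows = ["" for _ in range(rails)]
    rail, step = 0, 1
    for c in plaintext:
        rows[rail] += c
        if rail == 0:
            step = 1          # bounce down from the top rail
        elif rail == rails - 1:
            step = -1         # bounce up from the bottom rail
        rail += step if rails > 1 else 0
    return "".join(rows)

print(rail_fence_encrypt("wearediscovered"))  # waeicvrderdsoee
```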

Figure 7-4: Rail Fence Transposition

The ciphertext retains much of the characteristic spelling and letter socialization of the plaintext and its corresponding language. Using more rows helped, but complexity increased beyond that which was reasonable and appropriate.

Codebooks

In addition to transposition ciphers, codes were also common prior to the use of contemporary cryptography. A code replaces a word or phrase with a character. Figure 7-5 shows a sample code. Using codes like our example was a good way to obfuscate meaning if the messages were small and the codebooks were safe.

However, using a codebook to allow safe communication of long or complex messages between multiple locations was difficult.

Figure 7-5: Code Table

The first challenge was creating the codes for appropriate words and phrases. After distribution, there was the chance of codebook capture, loss, or theft. Once compromised, the codebook was no longer useful, and a new one had to be created.

Finally, coding and decoding lengthy messages took time, time not available in many situations in which they were used. Codes were also broken because of characteristics inherent in the plaintext language. This provided the cryptanalysts with a finger hold from which to begin breaking a code.

Nomenclators

To minimize the effort involved in creating and toting codebooks, cryptographers in the 16th century often relied on nomenclators.

A nomenclator combines a substitution cipher with a small code set, as in the famous one shown in the accompanying figure. Thomas Phelippes (cipher secretary to Sir Francis Walsingham, principal secretary to Elizabeth I) used frequency analysis to break it. Based on what we learn from the history of cryptography, a good cipher makes it impossible to find the plaintext m from the ciphertext c without knowing the key. Actually, a good encryption function should provide even more privacy than that.


Achieving this ideal requires that any change to the plaintext, no matter how small, produce a drastic change in the ciphertext, such that no relationship between the plaintext and the resulting ciphertext is evident. The change must start at the beginning of the encryption process and diffuse throughout all intermediate permutations until reaching the final ciphertext.
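To illustrate this avalanche-style diffusion without a full cipher, the sketch below uses a cryptographic hash (SHA-256) as a stand-in: changing a few characters of the input flips roughly half of the output bits. The inputs are hypothetical.

```python
import hashlib

def bit_diff(a, b):
    """Count differing bits between two equal-length byte strings."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

d1 = hashlib.sha256(b"attack at dawn").digest()
d2 = hashlib.sha256(b"attack at dusk").digest()  # a small change to the input
print(bit_diff(d1, d2), "of", len(d1) * 8, "bits differ")
```

An ideal 256-bit output would differ in about 128 bits on average, leaving no visible relationship between the two inputs.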

Attempting to do this before the late 20th century, while maintaining some level of business productivity, was not reasonable. Powerful electronic computers were the stuff of science fiction.

Today, we live in a different world. AES, the Advanced Encryption Standard, is a symmetric block cipher, meaning that it uses a single key for encryption and decryption, and it ostensibly meets our definition of an ideal cipher. However, it has already been broken… on paper. Cryptanalysts have theoretically broken it, but we need better computers to test the discovered weaknesses. It will be some time before private industries have to worry about changing their encryption processes.

The figure depicts a simple block cipher. The plaintext is broken into blocks. Using a key, each block passes through the block algorithm, resulting in the final ciphertext. One of the problems with this approach is a lack of diffusion: the same plaintext with the same key produces the same ciphertext. Further, a change in the plaintext results in a corresponding and identifiable change in the ciphertext.
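A toy sketch of this diffusion problem (the 4-byte keyed-XOR "block algorithm" below is purely illustrative, not a real cipher): encrypting each block independently means identical plaintext blocks produce identical ciphertext blocks, leaking structure.

```python
KEY = bytes.fromhex("a1b2c3d4")  # illustrative 4-byte key

def toy_block(block):
    """Stand-in 'block algorithm' (keyed XOR) -- NOT a real cipher."""
    return bytes(b ^ k for b, k in zip(block, KEY))

def encrypt_blocks(plaintext, size=4):
    """Encrypt each block independently, as in the figure's simple scheme."""
    return b"".join(
        toy_block(plaintext[i:i + size]) for i in range(0, len(plaintext), size)
    )

ct = encrypt_blocks(b"SALESALESALE")
print(ct[0:4] == ct[4:8] == ct[8:12])  # True: repeated plaintext shows through
```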

Cipher block chaining (CBC), for example, adds diffusion by using ciphertext, an initialization vector, and a key. The initialization vector (IV) is a randomly generated and continuously changing set of bits the same size as the plaintext block.

The first block of plaintext is XORed with the IV and submitted to the block algorithm with the key, producing a block of ciphertext. That ciphertext is XORed with the next block of plaintext, and the result is submitted to the block algorithm using the same key, and so on. The resulting ciphertext changes whenever the IV changes.

If the final block of plaintext is smaller than the cipher block size, the plaintext block is padded with an appropriate number of bits. This is stronger, but it still fell prey to skilled cryptanalysts.
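The chaining just described can be sketched with the same kind of toy block algorithm (again purely illustrative; a real implementation would use a cipher such as AES): each plaintext block is XORed with the previous ciphertext block (the IV for the first block) before encryption, and a short final block is padded.

```python
import os

BLOCK = 4
KEY = bytes.fromhex("a1b2c3d4")  # illustrative key

def toy_block(block):
    """Stand-in block algorithm (keyed XOR); a real mode would use AES here."""
    return bytes(b ^ k for b, k in zip(block, KEY))

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encrypt_cbc(plaintext, iv):
    # Pad a short final block (zero bytes here, for simplicity).
    if len(plaintext) % BLOCK:
        plaintext += b"\x00" * (BLOCK - len(plaintext) % BLOCK)
    previous, out = iv, b""
    for i in range(0, len(plaintext), BLOCK):
        # XOR each plaintext block with the previous ciphertext block
        # (the IV for the first block) before the block algorithm.
        block = toy_block(xor(plaintext[i:i + BLOCK], previous))
        out += block
        previous = block
    return out

iv = os.urandom(BLOCK)  # a fresh random IV for each message
ct = encrypt_cbc(b"SALESALESALE", iv)
print(ct[0:4] != ct[4:8])  # True: chaining hides the repeated blocks
```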

AES, another block cipher, uses a more sophisticated approach, including byte substitution, row shifts, column mixing, and cipher-generated round keys for internal processing (NIST). It is highly resistant to any attack other than key-discovery attempts. The theoretical weaknesses mentioned earlier do not mean it is broken in practice; it is still the recommended encryption method for strong data protection.

Key Management

The processes underlying all widely accepted ciphers are, and should be, known, allowing extensive testing by all interested parties, not just the originating cryptographer. We tend to test our expectations of how our software creations should work instead of looking for ways they deviate from expected behavior. Our peers do not usually approach our work that way.

Consequently, allowing a large number of people to try to break an encryption algorithm is always a good idea. Secret, proprietary ciphers are suspect; whether a cipher is open or proprietary, only the relentless pounding on it by cryptanalysts can determine its actual strength.

Now that we have established the key as the secret component of any well-tested cipher, how do we keep our keys safe from loss or theft? If we lose a key, the data it protects is effectively lost to us. If a key is stolen, the encrypted data is at higher risk of discovery.
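One common answer, sketched here as an illustration rather than the chapter's prescription, is to avoid storing the raw key at all: derive it on demand from a passphrase with a standard key-derivation function such as PBKDF2, storing only the salt. The passphrase below is hypothetical.

```python
import hashlib, os

def derive_key(passphrase, salt, iterations=200_000):
    """Derive a 32-byte key from a passphrase with PBKDF2-HMAC-SHA256.
    Only the salt (and iteration count) need be stored, not the key."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

salt = os.urandom(16)
key = derive_key("correct horse battery staple", salt)
print(len(key))  # 32
# The same passphrase and salt always re-derive the same key:
print(key == derive_key("correct horse battery staple", salt))  # True
```

The high iteration count deliberately slows brute-force guessing of the passphrase, and a stolen salt alone reveals nothing about the key.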



Copyright © 2019 terney.info. All rights reserved.