Newsgroups: sci.crypt
Path: cactus.org!ritter
From: ritter@cactus.org (Terry Ritter)
Subject: Re: Block Mixing Transformations
Message-ID: <1994Mar16.093035.1330@cactus.org>
Keywords: DES replacement, Large blocks
Organization: Capital Area Central Texas UNIX Society, Austin, Tx
References: <1994Mar13.051515.27175@cactus.org> <1994Mar16.010710.4943@mnemosyne.cs.du.edu>
Date: Wed, 16 Mar 1994 09:30:35 GMT
In <1994Mar16.010710.4943@mnemosyne.cs.du.edu> colin@nyx10.cs.du.edu
(Colin Plumb) writes:
>Okay, time to come clean: I get a bit annoyed when I see a fancily-formatted
>"report" that might fool some people when such glaring points as this are
>wide open. It seems enormously pretentious and makes me think poorly
>of the author. Sorry if it's unkind, but it's true that a part of
>my mind is saying "I wish Terry would quit posting his `clever' ideas
>until he learns thing one about cryptanalysis."
Frankly, I think this says more about Colin Plumb than Terry Ritter.
When Plumb publishes a practical cryptosystem which he can *prove*
secure, then I'm sure the rest of us will take some notice. Until
then, we all must learn to deal with--and depend upon--not only
proposals, but actual fielded systems of unknown strength.
Frankly, I think the *practice* of Science--and especially
cryptography--benefits more from the exposure of reasonable
wrong approaches than from right ones. Nobody comes up with only great
ideas; to publish only after a great one is finally found and
confirmed is to hide what the work of Science actually is. The
work is the process, not the result. There will always be another
result.
And, sci.crypt is not a refereed publication. I do not expect
to treat it like one.
>Yes, it's called polyalphabetic substitution and ciphertext-only attacks
>are trivial.
Well, the term "polyalphabetic substitution" is normally applied
to stream ciphers instead of block ciphers. To apply it here,
we must already have come to the conclusion that the mixing is
so ineffective that individual characters are enciphered--through
the entire cipher--by independent substitutions. This seems
unlikely, but in any case it has not been shown. Thus, the
term is inappropriate.
Perhaps, after some study, Plumb would recognize that substitution
is "weak" precisely to the extent that it can be separated and
investigated. Plumb might also want to look at the construction
of DES for an example of the use of small, extremely-weak
substitutions as major parts of a very respected cryptosystem.
>Now, the mixing probably makes it harder, but I'm not convinced it's
>that much harder.
A critical comment made without benefit of analysis? "It seems
enormously pretentious and makes me think poorly of the author."
>There are two desirable properties in a mixing: non-linearity (see all
>the papers on bent functions, which are functions a maximum Hamming
>distance from any affine function) and the Strict Avalanche Criterion.
1) Keyed substitution is inherently non-linear. (The probability
that a randomly keyed substitution is linear is vanishingly small.)
2) It is easy to build weak ciphers which have good avalanche
properties.
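For concreteness, point 1 can be sketched in a few lines of Python
(my illustration, not from the article): build a byte-wide substitution
by key-seeded shuffling, then apply the GF(2) affine test. The integer
seed below merely stands in for a real key schedule.

```python
import random

def keyed_sbox(key):
    """Build a byte-wide substitution table by key-seeded shuffling
    (Fisher-Yates via random.shuffle).  The seed is a stand-in for a
    real key schedule."""
    rng = random.Random(key)
    table = list(range(256))
    rng.shuffle(table)
    return table

def is_affine(f):
    """GF(2) affine test: f is affine iff f(a^b) == f(a)^f(b)^f(0)
    for all byte pairs a, b."""
    c = f[0]
    return all(f[a ^ b] == f[a] ^ f[b] ^ c
               for a in range(256) for b in range(256))

sbox = keyed_sbox(0x1234)
print(is_affine(sbox))   # a randomly shuffled table is essentially never affine
```

Of the 256! byte permutations, only a vanishing fraction pass the
affine test, which is the sense in which keyed substitution is
inherently non-linear.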
The whole point of my proposal is to separate the mixing from the
strength. A cipher with extremely fast, weak mixing may well be
preferable to the slow form Plumb proposed, which turns out in fact
to be 20-50 times slower. (!!!!)
Think about it: Using the fast mixing I proposed, one can afford
to make a block cipher with perhaps 20 times as many stages--and 20
times as many unique substitution tables--as one with Plumb's
"improved" mixer. I would say that there is at least a reasonable
chance that the cipher with the fast mixer--having up to 20 times
as many stages--would be a stronger cipher. And if it could be
better analyzed for strength, and so use fewer stages and operate
faster as well, so much the better.
>This is immediately recognizable as Pascal's triangle, but serves to
>illustrate the fact that a 1-bit change in the input is slow to propagate.
One of the characteristics I mentioned for the fast mixing function
was the fact that a single bit-change in *either* input block would
produce at least one (and probably more than one) bit change in
*both* output blocks. A modest propagation, true, but it rolls
through 61 mixing operations per stage. The reason for cascading
the mixings is to spread the various changes throughout the field
of substitution inputs.
As soon as even one input bit to a substitution changes, we get an
inherent "avalanche" in the output from that substitution. Thus,
the role of the mixing is to get bit changes into (potentially)
all the substitutions.
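The inherent avalanche in a substitution is easy to demonstrate
(again a Python sketch of mine, not from the article): for a randomly
keyed byte-wide table, flipping a single input bit flips about half
of the output bits on average.

```python
import random

rng = random.Random(42)                    # stand-in for a key schedule
sbox = list(range(256)); rng.shuffle(sbox)

def hamming(x, y):
    """Number of bit positions in which two bytes differ."""
    return bin(x ^ y).count("1")

# Average number of output bits that flip when one input bit flips,
# over all 256 inputs and all 8 single-bit flips.
total = sum(hamming(sbox[x], sbox[x ^ (1 << i)])
            for x in range(256) for i in range(8))
print(total / (256 * 8))                   # close to 4 of 8 bits, on average
```

So a one-bit change reaching any substitution already produces a
multi-bit change at its output; the mixing only has to deliver that
one bit everywhere.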
>Because you only have three levels of S-boxes, I suspect the whole thing
>can be reduced to a system of simultaneous equations of reasonable size
>and solved with a bunch of known plaintext that is basically the size
>of the unicity distance.
Big talk. Let's see it.
>Cryptography is harder than cryptanalysis.
Not in my experience.
>Cryptanalysis (of modern ciphers)
>is extremely difficult. I'm not qualified to do either, although I'm
>slowly working my way through the cryptanalytic literature in an attempt
>to develop a modicum of skill.
Interesting, then, that Plumb feels so capable of judgement.
>If you look at papers proposing ciphers, you'll see that the onus is
>on the proposer to show that a number of known methods of cryptanalysis
>are ineffective.
If someone were to actually read my Block Mixing article, s/he
would see that it was concerned with the existence of such
structures, and with one fast structure in particular.
That structure works. It reversibly mixes without expansion.
The article is not in error. I did expect the mixing to be
stronger than it turned out to be, but "strength" was specifically
disclaimed as an issue.
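As a sketch of the kind of structure involved, here is a two-byte
mixer in Python following the X = 3A + 2B, Y = 2A + 3B form over
GF(2^8). The particular modulus polynomial below is my assumption
for illustration; the article's choice may differ.

```python
def gmul(a, b):
    """Multiply in GF(2^8) modulo x^8+x^4+x^3+x+1 (0x11B).  Any
    irreducible degree-8 polynomial gives the same structure."""
    r = 0
    for _ in range(8):
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & 0x100:
            a ^= 0x11B
    return r

def mix(a, b):
    """X = 3a + 2b, Y = 2a + 3b over GF(2^8): two bytes in, two bytes
    out -- no expansion -- and reversible, because the 2x2 matrix
    [[3,2],[2,3]] has nonzero determinant 3*3 ^ 2*2 = 5 ^ 4 = 1."""
    return gmul(3, a) ^ gmul(2, b), gmul(2, a) ^ gmul(3, b)

a, b = 0x57, 0xA3
x, y = mix(a, b)
print(mix(x, y) == (a, b))     # this particular matrix is its own inverse
x2, y2 = mix(a ^ 0x01, b)      # flip one bit of one input only
print(x2 != x and y2 != y)     # the change reaches BOTH outputs
```

Reversibility and the absence of expansion are exactly the properties
the article claimed; "strength" was never one of them.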
The article was not concerned with overall cryptosystems, other
than to explain why one might want a Block Mixing structure in
the first place, and to propose some cipher constructs which
might be worth investigating. It specifically stated:
"These are new ciphering architectures. Clearly, it is not
known how strong these constructs would be."
and
"Other opportunities exist when constructing completely new
block ciphers. These might, for example, be based on byte-wide
key-permuted substitutions, thus avoiding differential attacks
on fixed "optimal" tables."
Since the article was specifically limited to detailed discussion
of the *mixing*, it does seem a little odd that it would be
criticized for not being a proven cipher.
>"Show" does not mean "I couldn't do it", but "Here
>is a proof that nobody can do it."
Ah, then we must *have* practical cryptosystems which are proven
secure. Which ones would those be, exactly?
>The requirement for proof can be
>weakened to an argument, but Xuejia Lai's PhD Thesis gives a good
>example to try to emulate.
While emulating someone else's thesis may be Plumb's highest goal,
it certainly is not mine.
Postings to sci.crypt are not, and should not be considered to be,
scientific publications. We need room to speculate, room to fail
without career implications, and room to consider alternative
approaches that your PhD advisor may not like. One of these may
lead to better solutions. Or not.
We have a lot of people laboring in secret because they are afraid
to say anything and get the kind of response we see from Plumb.
We are all poorer for it: those who read, because they never see
the new ideas; and those who don't write, because it is through
error that they learn more than they know.
>One cipher that might be worth exploring is one based on an FFT-like mixing
>pattern (all butterflies the same size), with S-boxes after each round of the
>FFT. But if you were to stick to 8-bit mixing, you could merge the mixing
>function with the previous S-box by having each of the two input S-boxes
>to the mixer emit 16 bits, which are XORed and the two halves used as outputs.
>And that would permit much more complex mixing functions with no increase
>in cost.
It would certainly be a different cipher. But it would be moving in
the opposite direction from my proposal, which specifically intends
to separate the concepts of "strength" and "mixing", each into its
own particular mechanism.
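For reference, the FFT-like pattern Plumb describes might look like
the following toy (my reading of his description, not a proposal;
the XOR butterfly and the single shared table are placeholders):

```python
import random

rng = random.Random(7)                      # stand-in for a key schedule
SBOX = list(range(256)); rng.shuffle(SBOX)  # one shared table, for brevity

def fft_round(block, r):
    """One FFT-style round on a list of bytes: pair the positions
    that differ in bit r, mix each pair with a reversible two-input
    butterfly, then substitute every byte."""
    out = list(block)
    for i in range(len(block)):
        j = i ^ (1 << r)
        if i < j:
            a, b = block[i], block[j]
            out[i], out[j] = a ^ b, a       # invertible: b = (a^b)^a
    return [SBOX[x] for x in out]

block = list(range(8))                      # 8-byte block => 3 rounds
for r in range(3):
    block = fft_round(block, r)
print(block)
```

After log2(n) rounds every input position has influenced every output
position, which is the attraction of the FFT pattern; but the
"strength" again rides on the mixing itself.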
One of the major problems with modern cipher design is the inability
to *measure* "strength." It is very easy to say "this mixer is
nonlinear so it must be stronger." It is easy to *say*, but not so
easy to prove. And very few academic papers even begin to approach
the idea that "stronger" may not be worth the expense, if it takes
20 times as long.
I have proposed to finesse the issue of unknown mixing strength
by placing the strength in particular simple components in a
relatively simple structure. If this approach is successful, we
can avoid speculating about the "strength" of the mixing, since
it is exactly this speculation which leads to the question "How
many rounds do we need?" Where is the formula which provides that
answer?
If the approach is not successful, we know yet one more thing
that does not work, and, hopefully, why. And this knowledge must
benefit the continued search.
>But remember, these are all off-the-cuff ideas; if I were to start work
>now and not find any holes after 6 months of effort, it might be worth
>considering using them to protect data.
I certainly am not using any of the many schemes I suggested in
the article to protect data.
However, I have not seen a sufficient problem with the approach to
prevent continued investigation in that direction.
Until that succeeds, Plumb might consider using my Penknife email
cipher for DOS. Admittedly quite a stretch from the theories
tossed around here, Penknife is an honest, fielded, commercial
stream-cipher product which does not infringe anyone else's patent.
It is based on my own patent for Dynamic Substitution, which was
described in my (refereed) Cryptologia article of October 1990.
Penknife also uses random-number techniques described in my
Cryptologia article (59 pp., 213 refs) of April 1991.
---
Terry Ritter ritter@cactus.org (cactus.org dies Friday)
ritter@io.com (perhaps temporarily)
ritter@rts.com (perhaps temporarily)