From: (Terry Ritter)
Newsgroups: sci.crypt

Subject: Re: Algorithms
Date: 18 Nov 1994 03:13:02 -0600
Organization: UTexas Mail-to-News Gateway
Lines: 138

 In <1994Nov15.231930.1060@Princeton.EDU>
 (David A. Wagner) writes:

>>  The sweeping generalization that Triple <anything> is *necessarily*
>>  stronger than <anything> on its own is false by contrary example,
>>  and the groupiness of <anything> is irrelevant.
>I agree; but that's not the point here.

 It may not be *your* point, but it *is* *my* point.

>It does seem reasonable
>to believe that triple DES is stronger than DES.  Why?  Because
>crypto experts have tried their darndest to break triple DES,
>without much success.  Because single DES seems to be a very
>well-designed primitive -- except for the short keylength.

 There are several more problems with DES: one is the small block
 size.  If 2**56 keys are a problem, then 2**64 block values should
 be a similar problem.  Why do we not consider 2**8-element tables
 secure?  Are 2**32-element tables secure?  I claim that if 2**64
 elements are not a problem now, they soon will be, and certainly
 long before the next 20 years are out.
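 The block-size concern can be made concrete.  With 64-bit blocks in
 CBC mode, ciphertext collisions become likely around 2**32 blocks
 (the birthday bound), and each collision leaks the XOR of two
 plaintext blocks.  A rough sketch of the arithmetic (illustrative
 only, not from the original post):

```python
import math

def collision_probability(num_blocks: float, block_bits: int) -> float:
    """Approximate birthday-bound probability that at least two of
    num_blocks ciphertext blocks collide for a block_bits-bit cipher:
    p ~ 1 - exp(-n*(n-1) / 2^(b+1))."""
    space = 2.0 ** block_bits
    return 1.0 - math.exp(-num_blocks * (num_blocks - 1) / (2.0 * space))

# With 64-bit blocks, 2**32 blocks (only 32 GiB of data) already gives
# roughly a 39% chance of some colliding pair.
p = collision_probability(2.0 ** 32, 64)
```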

 Yet another problem is the internal use of tiny, fixed (and thus
 well-known) substitution tables.  This is just asking for trouble.
 We know how to do better than this, and we can afford to do far
 better, even as the original designers could not.
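 One sketch of "doing better" is a key-dependent substitution table:
 build the table from the key itself, so the analyst has no fixed,
 well-known object to study.  The key-derived seeding below (SHA-256
 plus Python's random module) is purely illustrative and stands in
 for a purpose-built key schedule:

```python
import hashlib
import random

def keyed_sbox(key: bytes):
    """Build a key-dependent 8-bit substitution table by shuffling
    the identity permutation with a key-derived seed.  A real design
    would use a dedicated key schedule, not random.Random."""
    table = list(range(256))
    seed = int.from_bytes(hashlib.sha256(key).digest(), "big")
    random.Random(seed).shuffle(table)
    return table

sbox = keyed_sbox(b"example key")
inverse = [0] * 256
for i, v in enumerate(sbox):
    inverse[v] = i          # invertible, since the table is a permutation
```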

>[For example, read about how DES becomes more resistant to both
>differential and linear cryptanalysis when more rounds are added.
>Read about how the best attacks on two-key triple DES require
>2^56 memory, 2^56 chosen plaintexts, and 2^56 operations; or else
>(from memory here) much much more than 2^56 operations, and a whole
>bunch of known plaintexts -- see Crypto '90.  Anyone know of any
>attacks on three-key triple DES better than brute force?]

 Again, *this* is beside my point.

 If I had a workable attack I could defeat your argument, but
 requiring me to have and disclose such an attack before you will
 move to a stronger cipher must defeat your own security.  It is
 necessary to anticipate attacks, rather than simply responding to
 them as they are disclosed.  Attacks may exist which we do not
 know, and yet, to provide good crypto, we must defeat them anyway.
 Thus we must assume that such attacks exist.  This is the way the
 game works.

>Ok, so what are the real underlying problems, in your opinion?

 First, much of the commercial security world has been essentially
 locked into DES because it is the only cipher "certified" for use
 by the US Government.  Because most systems are not set up to use
 multiple ciphers, the idea that DES has become attackable is
 frightening, and so it is rejected.  DES is being used beyond its
 advisable life.

 We should at all costs avoid being trapped into yet another single
 standard cipher.  Instead, we should standardize the ability to
 negotiate a mutually-agreeable cipher by textual name, rather than
 by a number assigned by a standards body.  Standard interfaces
 should allow the dynamic replacement of ciphers which are found
 weak, as soon as such indications occur.  Replacement ciphers which
 defeat new attacks could be made available in weeks or months, and
 used automatically.
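 The negotiation idea might look something like the following sketch;
 the registry, the cipher names, and the preference-order rule are
 all hypothetical, not a real protocol:

```python
# Sketch of cipher negotiation by textual name.  The registry and
# names here are illustrative only.
CIPHER_REGISTRY = {}

def register_cipher(name: str, implementation) -> None:
    """Install (or hot-replace) a cipher under its textual name."""
    CIPHER_REGISTRY[name] = implementation

def negotiate(ours, theirs) -> str:
    """Pick the first mutually-supported, installed cipher, in our
    preference order; fail loudly if there is no overlap."""
    for name in ours:
        if name in theirs and name in CIPHER_REGISTRY:
            return name
    raise ValueError("no mutually-agreeable cipher")

register_cipher("des", "des-impl")
register_cipher("triple-des", "3des-impl")
choice = negotiate(["triple-des", "des"], ["des", "triple-des"])
```

 A cipher found weak would simply be dropped from the registry, and
 negotiation would flow to the next mutually-agreeable name.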

 Next, Triple-DES is being promoted as the major savior, since
 it will essentially function like DES, albeit with more key
 material.  Surprisingly, many systems people hate Triple-DES,
 because of the increased processing overhead.  It is a facile
 argument to say "get a larger computer" when it is difficult to
 keep up with processing growth as it is.  Cipher execution is
 an obvious overhead which has the potential to delay *every*
 *transaction* in a modern data-based organization.  Because of
 the overhead of the proposed replacement, the old solution is
 being retained in the face of evidence of weakness.

 Nevertheless, Triple-DES is being touted as the saving force,
 based on assumptions that it is more secure than DES alone.
 As usual, there is no proof of this.  In fact, it seems very
 reasonable to me that if some defect is found in DES, the use of
 DES three times may not hide that defect.  There are alternatives:
 use larger, stronger ciphers, perhaps based on DES itself; or at
 least use three levels, each with a different block cipher; or, more
 generally, use totally different, much-stronger, and much-faster
 ciphers.

 At least one attack exists in which Triple <block cipher> is not
 stronger than <block cipher> alone, and that is an attack on the
 overall permutation.  We assume that that attack is prevented by
 using CBC.  But it is a real example of a real attack, and we have
 no particular reason to believe that it is the only possible attack
 in that class.
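 One way to see such a permutation-level attack: in ECB, any block
 cipher, tripled or not, is still one fixed permutation of the block
 space, so equal plaintext blocks produce equal ciphertext blocks.
 A toy demonstration (the 16-bit cipher here is illustrative only):

```python
def toy_block_cipher(block: int, key: int) -> int:
    """A toy 16-bit block cipher (illustrative only): XOR with the
    key, then rotate left by 3 bits."""
    x = (block ^ key) & 0xFFFF
    return ((x << 3) | (x >> 13)) & 0xFFFF

def ecb_encrypt(blocks, keys):
    """ECB: each block enciphered independently, one pass per key
    (three keys = triple encryption)."""
    out = []
    for b in blocks:
        for k in keys:
            b = toy_block_cipher(b, k)
        out.append(b)
    return out

plaintext = [0x1234, 0xBEEF, 0x1234]        # a repeated block
single = ecb_encrypt(plaintext, [0x0F0F])
triple = ecb_encrypt(plaintext, [0x0F0F, 0xA5A5, 0x3C3C])
# In both cases equal plaintext blocks yield equal ciphertext blocks:
# the codebook structure survives any amount of tripling.
```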

 In fact, the function of a block cipher is to emulate, as well as
 possible, a huge keyed substitution table.  It should be evident
 that any simple machine can do this only imperfectly.  With
 increasing advances in processing, it may be possible to start
 to delineate that imperfection.  In particular, it may be that
 the output of the cipher is in some way defined or bounded in
 permutation space.  Such a bound might be large enough to not
 affect DES itself, and yet make Triple-DES almost as weak as DES.
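 The size of permutation space is easy to count, which makes the
 point vivid: there are (2**64)! permutations of 64-bit blocks, but
 at most 2**56 of them are reachable by DES keys, and at most 2**168
 by three-key Triple-DES.  A quick calculation (Stirling, via the
 log-gamma function; the framing is mine, not from the original
 post):

```python
import math

def log2_factorial(n: float) -> float:
    """log2(n!) computed via the log-gamma function."""
    return math.lgamma(n + 1) / math.log(2)

# Bits needed to name one arbitrary permutation of 64-bit blocks:
all_perms_bits = log2_factorial(2.0 ** 64)   # about 1.15e21 bits
# Bits of permutation actually selectable by key:
des_bits = 56        # single DES
tdes_bits = 3 * 56   # three-key Triple-DES
# Either way the reachable fraction of permutation space is
# astronomically small; tripling the key does not change the picture.
```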

 This is a situation which separates cryptography from classical
 Science:  In Science, it is expected that something not be stated
 unless it is proven true in some way.  Alas, there is no final
 proof in cryptography, yet we hear claims just the same, because
 it is necessary to somehow describe something we cannot measure.
 The conventional approach:  "DES is strong unless you have a
 provable attack" does not serve us well to define the cipher of
 the future.  Not only do we not know whether or not someone has
 some effective attack *today*, we cannot even know what *tools* will
 be available for future analysis and understanding.  Our only hope
 of cipher success is to design ciphers which are vastly stronger
 in every way we know, in the hope that they will serve well into
 tomorrow.

 I think we want new ciphers which will last at least a couple of
 decades.  We are constrained to build these ciphers only on the
 basis of research that we know, research which was conducted in the
 past, with the capability of past tools.  But the new ciphers must
 survive in an environment of the future, with massively more-
 effective tools.  We all know that past research (especially the
 sort of negative information we get in cryptography) does not
 necessarily imply the outcome of the future.  We know DES is weak,
 and we know DES is small, and if we are going to fix the problem
 of weak DES, we should really *fix* it, instead of doing the same
 old thing (three times as hard) and wishing and hoping it will work
 for the next twenty years.

 Terry Ritter