Path: news.io.com!news.fc.net!news3.net99.net!news.cais.net!news.sprintlink.net!cs.utexas.edu!not-for-mail
From: ritter@io.com (Terry Ritter)
Newsgroups: sci.crypt

Subject: Re: Variable-size block ciphers
Date: 27 Aug 1995 15:11:59 -0500
Organization: UTexas Mail-to-News Gateway
Lines: 296
Sender: nobody@cs.utexas.edu
Message-ID: <199508272011.PAA17751@tristero.io.com>
References: 
NNTP-Posting-Host: news.cs.utexas.edu

 In sci.crypt, John Kelsey writes:

>From what I've seen of this construction, it seems odd to me that
>you don't need more rounds to handle larger blocks.  (Intuitively,
>getting full diffusion through a 256-bit block should take more
>work than getting full difusion through a 64-bit block.)

 Apparently there is "diffusion," and then there is "diffusion."

 It is certainly clear that a substitution table will conduct any
 input bit-change to an output bit change.  It is also clear that
 an exclusive-OR chain *must* conduct any of those bit changes to
 all subsequent elements.  Given shuffled substitution tables at
 the next level, it is clear that this will produce a randomized
 result (in the subsequent elements).
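
 As a concrete illustration, here is a minimal sketch (in C) of the
 two element-level steps just described: a keyed substitution, and a
 right-going exclusive-OR chain.  This is only an illustration of the
 structure under discussion, assuming byte-wide elements and a single
 table; it is not the actual VSBC code.

    #include <stddef.h>
    #include <stdint.h>

    /* One keyed-substitution ("fencing") pass over the block. */
    void fencing_layer(uint8_t *blk, size_t n, const uint8_t tab[256])
    {
        for (size_t i = 0; i < n; i++)
            blk[i] = tab[blk[i]];
    }

    /* One right-going diffusion pass:  any change in element i is
       necessarily conducted to every element after i.             */
    void diffuse_right(uint8_t *blk, size_t n)
    {
        for (size_t i = 1; i < n; i++)
            blk[i] ^= blk[i - 1];
    }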

 My point is that experiments show that multiple layers of this
 structure test very much like a "true" overall diffusion.  To avoid
 particular attacks, it may be that at least two layers must go in
 each direction, leading to five confusion layers.  But, once we
 have full nonlinear diffusion, one more confusion layer should
 provide "strength."


>Unfortunately, I don't have access to a web browser that will let
>me view images, so I'm stuck with ASCII diagrams--so I may be
>missing something.

 Ah, now I see the problem.  I assumed that I was the last person
 in the known universe to have webbing capabilities.

 Although I have used ASCII diagrams in the past, I was never
 satisfied with the diagrams I made for these VSBC things.  The
 .GIF's turned out *so* much better that I didn't bother putting
 ASCII diagrams in the HTML document on the web page.  I didn't
 (and don't) really intend to post the design document to the
 newsgroup.

 I believe that it is academically important to have webbing
 available, at least occasionally.  To do this, one first needs an
 Internet Service Provider which handles SLIP or PPP (preferred)
 logons.  Next, one needs a 386/486/P5 PC, typically one running
 Microsoft Windows.  Then, one needs protocol "middleware" like
 Trumpet Winsock (shareware), which the ISP may provide.  Last, one
 needs a browser, typically Netscape, which is free for individuals.
 That's pretty much it, and if you know somebody with a similar
 set-up, you can probably get set up in an hour or so.  It *is* worth
 the effort.


>>    1. There will be some applications in which data-expansion can
>>       be eliminated.
>
>This is do-able with any fixed-length block cipher, if you're
>willing to do some extra work in encryption and decryption.  If you
>eliminated data expansion by changing the block size of the last
>block, it seems like you'd be saving yourself little work.

 Define "do-able":  Typically this is done by switching to stream-
 cipher mode for the last block.  But if this were really acceptable,
 it would be used for the whole message:  Normal stream-cipher mode
 has no data diffusion at all, compared to the full diffusion we
 expect and demand in a block cipher.

 So, we save having two modes, potentially additional hardware for
 one, and the complexity of switching between them.  There is some
 advantage; just how much depends on the individual case.

 These advantages are cumulative: a new cipher need not be the
 best in every possible category.  Nor do the VSBC designs need to
 better every known cipher in every way.  In certain important
 applications, for example database ciphering, they could be a
 significant improvement over anything now available.  In other
 applications, they could be at least competitive.
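
 As a small worked example of the data-expansion point (just the
 arithmetic; no cipher is invoked), assuming a fixed 8-byte block:

    #include <stdio.h>

    /* A fixed 64-bit block cipher must round a message up to a whole
       number of blocks (or switch modes for the tail); a variable-
       size block cipher can encipher exactly n bytes.               */
    int main(void)
    {
        for (unsigned n = 20; n <= 23; n++) {
            unsigned padded = ((n + 7) / 8) * 8;
            printf("%u plaintext bytes -> %u padded, %u as one "
                   "variable-size block\n", n, padded, n);
        }
        return 0;
    }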


>>    2. There will be other applications which can avoid buffering
>>       by directly ciphering application-dependent field sizes.
>>    3. There may be a few applications (perhaps database ciphering,
>>       or voice CODEC's) which actually can use a cipher of
>>       dynamically-variable size.
>>    4. A single design can handle all of the above, as well as
>>       64-bit blocks, 256-Byte blocks and 1 KB blocks.  Thus, a
>>       single cipher can handle database ciphering, disk sector
>>       ciphering, and communications.
>>    5. Tiny versions of the same cipher can be exhaustively studied,
>>       and this can be done by parameterizing production-capable
>>       code.  (Validating results between this and the actual
>>       production code then validates the high-efficiency
>>       realization.)
>
>All four of these depend on the idea that it's possible to come up
>with equally-secure versions of your block cipher for any given
>size.

 These are advantages relating to variable size, which is in fact
 the issue I have addressed in these designs.


>I really would like to see some explanation of why you are
>convinced that this is the case.

 "Convinced" is not a word I would use.  I guess that I currently
 accept (as a reasonable basis for action) that a full, nonlinear,
 balanced, bit-level diffusion implies overall "strength" in just
 one more confusion layer.  I can measure diffusion experimentally.
 I don't know how to measure "strength," and neither does anybody
 else who is talking.  It is consequently difficult to build a
 cipher with measurable "strength."
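
 For what it is worth, the diffusion measurement itself is simple to
 sketch (in C):  flip one plaintext bit, cipher both texts, and count
 how many ciphertext bits change.  The cipher argument here is just a
 placeholder signature; whatever block transformation is under test
 goes in its place.

    #include <stdint.h>
    #include <stdlib.h>
    #include <string.h>

    /* Average number of output bits changed per single-bit input
       change; the ideal for an n-byte block is about 4.0 * n.     */
    double avalanche(void (*cipher)(uint8_t *blk, size_t n),
                     size_t n, unsigned long trials)
    {
        uint8_t *a = malloc(n), *b = malloc(n);
        unsigned long changed = 0;

        if (!a || !b) {
            free(a);
            free(b);
            return -1.0;
        }

        for (unsigned long t = 0; t < trials; t++) {
            for (size_t i = 0; i < n; i++)
                a[i] = (uint8_t)rand();
            memcpy(b, a, n);
            b[rand() % n] ^= (uint8_t)(1u << (rand() % 8));

            cipher(a, n);
            cipher(b, n);

            for (size_t i = 0; i < n; i++) {
                uint8_t d = a[i] ^ b[i];
                while (d) { changed += d & 1u; d >>= 1; }
            }
        }
        free(a);
        free(b);
        return (double)changed / (double)trials;
    }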


>It's easy to create variants of
>Blowfish, SAFER, and many other ciphers with different block sizes,
>but it looks like changing the block size changes the security
>properties of these ciphers, perhaps requiring more rounds to be
>secure.

 In which case, "creating variants" does not seem all that "easy."

 Indeed, most Feistel-style block ciphers will have power-of-2
 block sizes.  One *can* use smaller mixings, of course, but that
 seems contrary to the usual practice of a broad half-block mixing.

 It is *NOT* easy to diddle with existing block ciphers to provide
 a good block cipher for data of different sizes.  One can, of
 course, always use a stream cipher, but then one gets into the
 "re-origination" issue (which I addressed in my Penknife commercial
 stream cipher for e-mail).

 "Re-origination" refers to the a cipher starting in the same state
 multiple times.  This sort of thing is necessary when we want to
 provide for messages being ciphered in some unknown sequence (like
 network packets).  With a stream cipher working on bytes, this
 immediately leads to an attack in which The Opponent can collect
 many messages which start out in the same way.  We can think to
 include other data (essentially keying data) with each "message,"
 but this may not always be possible.  On the other hand, if we
 have the ability to cipher blocks of variable size, most of
 which are very large (20..1500 Bytes), a similar attack is
 difficult or impossible.
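
 A toy demonstration of the problem (in C), with a made-up keystream
 standing in for any byte-wise stream cipher restarted from the same
 state:

    #include <stdint.h>
    #include <stdio.h>

    /* If two messages are enciphered from the same starting state,
       matching plaintext prefixes give matching ciphertext bytes,
       so The Opponent can see exactly where the messages agree.   */
    int main(void)
    {
        const char *m1 = "Dear Sir: please remit...";
        const char *m2 = "Dear Sir: ignore previous";
        uint8_t ks[32];

        for (int i = 0; i < 32; i++)   /* same "keystream" both times */
            ks[i] = (uint8_t)(i * 37 + 11);

        for (int i = 0; m1[i] && m2[i]; i++) {
            uint8_t c1 = (uint8_t)m1[i] ^ ks[i];
            uint8_t c2 = (uint8_t)m2[i] ^ ks[i];
            if (c1 == c2)
                printf("position %2d: ciphertext matches, so "
                       "plaintext matches\n", i);
        }
        return 0;
    }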


>> There are other advantages to VSBC technology:
>>
>>    1. These ciphers can closely approach the ideal of "overall
>>       diffusion," in which a change to *any* input *bit* changes
>>       *every* output *bit* with probability 0.5.  This is what
>>       we should expect and demand from a quality block cipher.
>
>Here, you're just saying that you have full diffusion, right?

 The word "just" in this sentence implies a fact not in evidence.

 It is not trivial or easy to get full diffusion.  Admittedly,
 diffusion appears as a side-effect in Feistel-type iterated block
 ciphers, so one might conclude that it just "happens" as a side-
 effect of a strong cipher.  But diffusion is *necessary*, and it is
 difficult to discuss diffusion in the context of Feistel ciphers
 *because* in these ciphers diffusion is only a side effect.


>This
>is a necessary but not sufficient condition for block cipher
>security, since without full diffusion, it's possible to perform
>various divide-and-conquer attacks on the block cipher.

 Right.  My point exactly.

 But I would further say that, once one has a non-linear, balanced,
 full diffusion, then one more confusion layer (e.g., a layer of
 keyed fencing substitutions) should be sufficient for strength.
 At least that seems to me to be a reasonable position given the
 current evidence.


>>    4. Additional independent strength is easily added with new
>>       layers, without disturbing the rest of the design.  In
>>       contrast, simply increasing the number of rounds which do
>>       the same old algorithmic thing may not improve strength at
>>       all.  (Is a 32-round DES stronger than 16-round DES?)
>
>Can you explain this a bit more clearly?  I'm not too sure I
>understand how you can be more certain that adding layers in your
>design adds strength, than you can be that adding rounds to DES
>adds strength.

 Triple-DES is not just DES with more rounds; it is DES with more
 rounds and also more *keys*.  Simply adding rounds to DES is not
 the way to improve strength.

 On the other hand, if we add new diffusion and confusion layers to
 an existing VSBC design, we typically add a new layer of keyed
 substitution operations.  This is additional state which The
 Opponent must confront.

 It would be difficult to deny that, if four layers have strength
 x, then five layers must have strength x+delta or x*delta or
 x**delta or some such.  The real issue is whether that delta is so
 small that we would need to add too many layers to be practical.
 I can only say that my bit-diffusion
 experiments indicate good diffusion (in *some* designs; a
 surprising number did not) with as few as three one-way diffusion
 layers.  I take full diffusion as a precursor to strength.


>>    5. VSBC technology produces a clear, regular, understandable
>>       structure in which the role of each element can be addressed
>>       mathematically.  In contrast, irregular and ad hoc designs
>>       depend on the mysterious strength of complex manipulations.
>>       While such manipulations may be "mathematical," they often
>>       do not carry any useful analytical properties.  Absent a
>>       comprehensive theory of design, such ciphers can only be
>>       analyzed individually, which is one of the major problems
>>       that cryptography has today.
>
>I would say that a number of block cipher designs fit this
>description, such as balanced Feistel networks with subkeys applied
>outside the f() functions, block cipher designs based on cellular
>automata, etc.

 Fine.  Let's just test this, shall we?  Given the well-known
 definition for DES, please provide a formula to describe the number
 of rounds needed for "full diffusion."  Also provide formulas or
 other reasoning showing that the resulting diffusion is "balanced"
 in each bit.  Now extend these formulas for arbitrary functions f().

 Given *any* keyed (randomized) substitutions, I suggest that we
 can make a good argument that at least three one-way diffusion
 layers (in two directions) and four confusion layers will produce
 something approaching experimental "full diffusion."  (I have so
 far approached the issue experimentally.)
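
 To make the layer count concrete, here is one possible stack (in C),
 reusing fencing_layer() and diffuse_right() from the earlier sketch:
 four confusion layers separated by three one-way diffusion layers in
 alternating directions.  The arguments t0..t3 are hypothetical names
 for independently keyed tables.

    #include <stddef.h>
    #include <stdint.h>

    void fencing_layer(uint8_t *blk, size_t n, const uint8_t tab[256]);
    void diffuse_right(uint8_t *blk, size_t n);

    /* Left-going counterpart of diffuse_right(). */
    void diffuse_left(uint8_t *blk, size_t n)
    {
        for (size_t i = n; i > 1; i--)
            blk[i - 2] ^= blk[i - 1];
    }

    /* Four confusion layers, three one-way diffusion layers. */
    void vsbc_sketch(uint8_t *blk, size_t n,
                     const uint8_t t0[256], const uint8_t t1[256],
                     const uint8_t t2[256], const uint8_t t3[256])
    {
        fencing_layer(blk, n, t0);
        diffuse_right(blk, n);
        fencing_layer(blk, n, t1);
        diffuse_left(blk, n);
        fencing_layer(blk, n, t2);
        diffuse_right(blk, n);
        fencing_layer(blk, n, t3);
    }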

 Another such set of layers would then protect against
 "fix-in-the-middle" attacks, but these are already impractical in the
 context of a huge block size and huge keyspace.  Indeed, the
 ability to have and easily use a large keyspace is another
 advantage of the VSBC design over Feistel designs.  (My
 experimental versions all use a 992-bit keyspace and a huge
 Jitterized RNG like that in my Cloak2 commercial design to
 shuffle the tables.)
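
 For completeness, keying a single table is just a shuffle of the
 identity byte permutation.  A minimal sketch (in C), with rand()
 only standing in for the large keyed RNG that actually drives the
 shuffle:

    #include <stdint.h>
    #include <stdlib.h>

    /* Build one keyed substitution table by shuffling the identity
       permutation (Fisher-Yates); the keyed RNG goes where rand()
       is called.                                                   */
    void make_keyed_table(uint8_t tab[256])
    {
        for (int i = 0; i < 256; i++)
            tab[i] = (uint8_t)i;

        for (int i = 255; i > 0; i--) {
            int j = rand() % (i + 1);
            uint8_t t = tab[i];
            tab[i] = tab[j];
            tab[j] = t;
        }
    }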

 One of the main problems with analyzing Feistel designs is that
 their structure supports arbitrary non-invertible f() functions.
 This is a problem because such functions do *not* guarantee that
 diffusion will *necessarily* be propagated in each mixing.
 This makes it difficult to make strong statements about diffusion
 without restricting f() to functions of some particular form.
 Presumably there are multiple such forms which support such
 reasoning, but we have not been given the Feistel design handbook.

 With respect to Cellular Automata, it seems to me that the bloom
 has faded from the CA rose.  CA designs were often touted as great
 stream-cipher RNG's which could never be analyzed, but they turned
 out to be just as analyzable as most other state-machine RNG's.


>> SAFER K-64 does not use "fencing" layers (arrays of keyed
>> substitutions), and does use iterative "rounds" which my ciphers
>> do not.  Still, if somebody has a problem with weak mixing (as a
>> lot of people did on sci.crypt last year), I encourage them to
>> take it up with Massey.  :-)
>
>Actually, you could go back to the earliest work on SP networks--
>fixed bit permutations are weak mixing layers, but nobody seems to
>doubt that they're useful in designing strong ciphers.

 Well, I doubt that anybody really denies that permutations can
 be useful.  In this particular case, I was looking for mixing
 transformations which would guarantee to propagate diffusion
 (element diffusion) across a large block.  Then these element
 values were *confused*, either by DES or by keyed substitutions.

 The fast, scalable, invertible, balanced transformations which I
 found were linear (although I subsequently have used an order-256
 randomized Latin square which is of course exceedingly nonlinear
 but also non-scalable).  The argument was made that the mixing
 transformations *obviously* did not contribute to strength
 *because* they were linear.  This argument was specious, because
 the transformations were not *intended* to provide primary strength
 per se, but were intended mainly to provide *provably* balanced
 diffusion *as a precursor* to strength.  This result was then fed
 into keyed substitutions which I believe *do* provide strength
 *under those proven conditions*.

 This is a completely different design strategy, one which separates
 confusion and diffusion.  The advantage is that we *can* generate
 *provable* and understandable small confusions.  We can also
 generate *provable* and understandable diffusion of various kinds.
 Then we can combine these things.  The one-way chain-style
 diffusion used in the VSBC designs follows that same path.

 ---
 Terry Ritter   ritter@io.com   http://www.io.com/~ritter