Path: cactus.org!milano!radar!cs.utexas.edu!qt.cs.utexas.edu!zaphod.mps.ohio-state.edu!wupost!uunet!bonnie.concordia.ca!thunder.mcrcim.mcgill.edu!sobeco!ozrout!elevia!alain
From: alain@elevia.UUCP (W.A.Simon)
Newsgroups: sci.crypt

Subject: The Braided Stream cipher
Message-ID: <1991Jul25.135326.12609@elevia.UUCP>
Date: 25 Jul 91 13:53:26 GMT
References: <8300@cactus.org>
Organization: The Electronic Path - Global Village
Lines: 149
Remark: could you please conserve the references - thank you

In <8300@cactus.org> ritter@cactus.org (Terry Ritter) writes:

	An excellent and constructive critical analysis of the
	cryptographic system I have proposed.  I will reply
	here and now to the most obvious points, and I will study
	the more involved ones at leisure, for subsequent reply.

	Thanks are in order for this rich material.

> Definition:
>      The Braided Stream cipher, described by William Simon, consists of a
> cryptographic combiner and a key stream:
>      * The combiner is a data multiplexer, which takes a single bit from
> one of multiple data channels.  The unselected channels do not advance,
> and their data bits remain ready until selected.
>      * There are at least two data channels.  One or more of the channels
> may be a random key stream for confusion purposes.
>      * The original key stream is supposed to be really random.

	Good summary.  However, it misses one key point: the insertion
	of random-length runs of random noise is the source of the
	cryptographic strength; the multiplexer is just a convenient
	implementation.
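	A minimal sketch of the braiding as I understand it (the names
	and the toy key bits below are mine, for illustration only):

```python
def braid(channels, selector):
    """Multiplex bits from several channels into one stream.
    `selector` gives, for each output position, the index of the
    channel to take the next bit from; unselected channels do not
    advance, so their next bit stays ready until selected."""
    pos = [0] * len(channels)          # per-channel read positions
    out = []
    for ch in selector:
        out.append(channels[ch][pos[ch]])
        pos[ch] += 1
    return out

def unbraid(stream, nchannels, selector):
    """Inverse: route each cipher bit back to its channel."""
    channels = [[] for _ in range(nchannels)]
    for bit, ch in zip(stream, selector):
        channels[ch].append(bit)
    return channels

data      = [1, 0, 1, 1]                 # plaintext bits (channel 0)
confusion = [0, 1, 1, 0, 1]              # random noise   (channel 1)
key       = [0, 1, 1, 0, 1, 0, 1, 0, 1]  # secret channel selections
cipher    = braid([data, confusion], key)
```

	Anyone holding the selection key can unbraid the stream and
	read channel 0; without it, every bit could belong to either
	channel.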

> Claimed Advantages:
>      * "simple," "fast," "high levels of confidence"
>      * "key management is inherent in the design"
>      * "not vulnerable to progress in mathematics or computational
> technologies"
> Some Implications:
>      * The original key may be "really random."  But, if so, then we might
> as well use that key with XOR and get the provably-secure one-time pad.

	Indeed, but with or without the braid, the insertion of random
	data provides a level of security beyond an XOR on the
	plaintext alone.  See subsequent articles for details; but if
	you think of encryption as an attempt to reach the maximum
	entropy of the string [log2(stringlength)], then a longer
	string offers more entropic potential.

>      * ANY secure cipher can also handle "key management" data simply by
> enciphering it as a message.  The reason this is not done is that the
> cipher key may have been "obtained," and the plaintext may be fully
> exposed, INDEPENDENT of the security of the cipher itself.  Shipping key
> material under these conditions will just assure that the cipher will
> continue to be insecure.  Key management is not a special feature of this
> particular cipher.

	Right, it is not; the SECURITY of the scheme IS a big feature of
	this particular system.  And even a known-plaintext attack can't
	yield a useful piece of the key.

>      * Key material which is reused, even in a complex way, is no longer
> "really random," and thus may, at least potentially, be broken.

	|8-)

	If you start with an extremely long and fully random string,
	then process it with fully random incoming noise, it may (!!!)
	exhibit some non-random behaviours, it may even exhibit some
	dynamic system behaviours (chaos, attractors or periodicities...),
	but the opposition has no way of predicting what these may
	eventually be.  They know neither the contents, nor the length
	(crucial), of any of the various key streams (the original one
	as well as the transmitted ones).

>      * For the one-data one-confusion system, if the key stream has a
> balanced distribution, the ciphertext expansion will be 100% (or the data
> rate will be reduced to 50% of the unenciphered rate).  In contrast,
> a one-time pad does not expand the ciphertext at all.

	This happens to be the largest part of the strength of the
	system.  And the extra length is not wasted since I use it
	to distribute fresh key material.
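	The expansion figure follows directly from the selection
	probability; a one-liner (the probabilities are illustrative):

```python
def expected_expansion(p_data):
    """Expected ciphertext expansion for a one-data, one-confusion
    braid where the selection key picks the data channel with
    probability p_data: sending n data bits costs n / p_data cipher
    bits on average, i.e. a fractional expansion of 1/p_data - 1."""
    return 1.0 / p_data - 1.0

balanced = expected_expansion(0.5)    # balanced key: 100% expansion
skewed   = expected_expansion(0.75)   # data channel favoured: ~33%
```

	A skewed key reduces the expansion, but also exposes the data
	channel more, as the next point discusses.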

>      * Dynamically, any particular data channel may run fast (and thus be
> less hidden and more exposed) or slow (for an even slower data rate).
> Transiently, one channel may be selected repeatedly, and thus have no
> protection at all.  Clearly, at some point, a "less hidden" message will
> become recoverable.

	The key being fully random, you are right, there could be long
	runs of 1's or 0's.  But remember that a) we are working with
	bits, so a run would have to span at least 8 bits, and it would
	take several such runs in a row, or one VERY long run, to give
	away even a little; b) due to the mixing of bits from the
	plaintext and the random source, there could be many more
	artifact "bytes" given away than real ones.  One point militates
	in favor of "discovering" cleartext bytes: due to the bitwise
	multiplexing, any given byte can start at any bit whatsoever in
	the stream; you can bet you will find a large number of readable
	text bytes.  Therefore, whatever "less hidden" stuff appears
	could be entirely misleading, and probably will be.
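	The artifact point can be shown by brute force: every selection
	key extracts SOME candidate plaintext from the cipher stream, so
	an exhaustive attacker faces a crowd of equally plausible
	decodings.  A toy count (the six-bit cipher is made up):

```python
from itertools import product

def candidates(cipher, data_len):
    """Every selection key that picks the data channel `data_len`
    times extracts some plaintext candidate from `cipher`; collect
    the distinct candidates a brute-force attacker would face."""
    found = set()
    for mask in product((0, 1), repeat=len(cipher)):
        if sum(mask) == data_len:
            found.add(tuple(c for c, m in zip(cipher, mask) if m))
    return found

# 7 of the 8 possible 3-bit plaintexts can be "extracted" from
# this one 6-bit cipher, each by some key.
hits = candidates((1, 0, 1, 1, 0, 1), 3)
```

	With longer streams the crowd of candidates grows explosively,
	and nothing in the cipher says which one is real.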

>                      Cryptanalysis of the two-channel system could proceed
> from the standpoint that the message is passing through a noisy channel.

	Not at all.  A noisy channel merely pushes the entropy of the
	stream towards its limit.  I propose to go beyond that by
	enlarging the string.  Noise would flip bits at random, more
	like an XOR.
	
> In this peculiar kind of noisy channel, data bits are not lost, but an
> arbitrary number of arbitrary confusion bits may separate data bits.  Thus,
> any statistics in the plaintext are not lost, but merely "mellowed out" by
> intervening confusion data.

	Mellowed out is kind of mellow a word to describe what happens
	here.  Furthermore, a statistical profile is useful only at the
	character level; we are working with bits, and we force the
	opposition to work with bits only.

> Analysis
>      If we are to avoid analyzing EVERY POSSIBLE configuration, we must
> pick one and stay with it.  The weakest system may be the two-channel
> scheme. We choose the weakest system to analyze in the hopes that it may
> help us learn to attack a stronger system.

	Agreed.

> The obvious way to analyze the system is to analyze its component parts.
> Because "key management" can be applied to any stream cipher, its analysis
> is irrelevant to the strength of this particular cipher.  Moreover, the

	Not quite true: the fact that I can guarantee secure key
	management means you can't make guesses about my keys.

> original key is supposed to be "really random," and no precise technique has
> been described for the extension of that key, so no precise analysis can be
> applied.  Thus, we are left with the combiner:

	There was an article on that subject, which I will send to you
	separately (A key management scheme).  There was an earlier
	message with a proposal for a key management language, but
	I dropped the idea.

> Combiner Characteristics
>      Stream cipher combiners have been one of the major research topics
> in the past decade [4,5,7], although most of those in the literature are
> non- reversible.  Such combiners mix multiple LFSR sequences and yield a
> resulting sequence with large linear complexity, in which the input
> sequences are statistically independent of the output.

	LFSR??
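	[For reference: LFSR here presumably abbreviates "linear
	feedback shift register," the standard key-stream generator in
	the literature Ritter cites.  A minimal sketch, with taps and
	seed chosen only for illustration:]

```python
def lfsr(seed, taps, nbits):
    """Fibonacci linear feedback shift register: each step emits
    the oldest bit and shifts in the XOR of the tapped bits."""
    state = list(seed)
    out = []
    for _ in range(nbits):
        out.append(state[-1])          # emit oldest bit
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]      # shift, insert feedback bit
    return out

# A 4-bit register tapped at positions 0 and 3 realizes a primitive
# polynomial, so any nonzero seed gives the maximal period 2^4 - 1 = 15.
stream = lfsr((1, 0, 0, 0), taps=(0, 3), nbits=30)
```

	The output looks random but is purely linear, which is why the
	literature combines several LFSRs nonlinearly.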

>[...skipping over material I'll address later, when I fully comprehend it...] 

>  Terry Ritter   cs.utexas.edu!cactus.org!ritter   (512) 892-0494
-- 
 William "Alain" Simon
 alain%elevia@larry.mcrcim.mcgill.edu