Secrecy through Synchronization Errors

Jason Castiglione, University of Hawaii, Email: [email protected]
Aleksandar Kavcic, University of Hawaii, Email: [email protected]

Abstract—In this paper, we propose a transmission scheme that achieves information theoretic security, without making assumptions on the eavesdropper's channel. This is achieved by a transmitter that deliberately introduces synchronization errors (insertions and/or deletions) based on a shared source of randomness. The intended receiver, having access to the same shared source of randomness as the transmitter, can resynchronize the received sequence. On the other hand, the eavesdropper's channel remains a synchronization error channel. We prove a secrecy capacity theorem, provide a lower bound on the secrecy capacity, and propose numerical methods to evaluate it.

I. INTRODUCTION

With the appearance of wireless communications in the smallest of devices, communicating securely has become of paramount interest. We see an abundance of small devices with limited computing capability having access to potentially sensitive data. In addition to limited power and computational capability, another limitation that these devices have in common is that they transmit over noisy channels. They all require error correction codes and specialized processors for reliable communications. A natural question arises: can we combine error correction codes and security?

Wyner's seminal paper [1] proved it was possible to communicate securely using wiretap codes, assuming the composite channels were discrete and memoryless. Csiszar and Korner [3] generalized Wyner's results to broadcast channels, where the unintended user does not necessarily listen to the output of the main channel through a noisier channel. Hayashi [2] further generalized the preceding results to arbitrary channels with finite output alphabet. Hayashi's results were based upon the information spectrum approach originating with Han and Verdu [4]. His results were particularly interesting since he included non-asymptotic results on secrecy capacity. Bloch and Laneman [5] built upon Csiszar's and Hayashi's work by using channel resolvability as a basis for stronger secrecy results based on variational distance, which hold for the most general of channels, e.g., channels with arbitrary alphabets, memory, etc. Specifically, the system we present in this paper has memory and a countably infinite output alphabet.

[Fig. 1. Synchronization Error Based Secrecy System. (Alice passes X^n through a synchronization error channel driven by a shared source of randomness; Bob resynchronizes to obtain Y^n; Eve observes Z^(n).)]

The main obstacle to achieving secrecy through wiretap codes over wiretap (or similar) channels [1], [3], [5] is that the secrecy proof hinges on knowing, and guaranteeing, the eavesdropper's channel characteristics. In practice, the eavesdropper is typically an adversary and will not reveal her channel characteristics. For this reason, we employ a strategy in which we assume that both the intended user and the eavesdropper receive (noise-free)^1 the same symbols transmitted by the transmitter (see Fig. 1). Instead of transmitting the codeword X^n, the transmitter uses a secret shared source of randomness to randomly inject synchronization errors into the transmitted sequence. The intended receiver has access to the shared source of randomness and can thus resynchronize the received sequence, although not necessarily recover X^n. The eavesdropper, on the other hand, does not have access to the shared source of randomness^2, and is thus forced to attempt to decode the transmitted codeword. Under this constructed scenario, as we show in this paper, the Bloch-Laneman secrecy results [5] hold, and we can therefore communicate securely.

This method is similar to the one in [6], but instead of injecting additive errors, the transmitter injects synchronization errors, which has the following advantageous features.
• Synchronization error channels are cryptographically favorable^3 compared to additive noise channels.
• We do not make unverifiable physical channel assumptions, for either the main or the eavesdropper channel.
• Inclusion of wiretap codes has low overhead because systems already use error correction codes.
• Assuming a secret shared source of randomness, we are guaranteed a stronger version of secrecy than Wyner [1], based on variational distance.

Of course, these advantages must be traded off against some undesirable properties, such as:
• We communicate at a different rate compared to conventional cryptographic systems, where the lengths of the plaintext and the ciphertext are typically equal.
• We must construct resolvability codes for channels with synchronization errors.
• In practice, we will require a secret preshared key in place of a secret shared source of randomness.

^1 In practice, this can be guaranteed, even over noisy channels, using error-correcting codes to achieve error-free transmission of information.
^2 For future considerations, we would like to replace the secret shared source of randomness with finite-length keys drawn from a finite key space.
^3 Knowledge of the plaintext does not guarantee an exact solution for the corresponding key.
A. Outline

In Section II, we present our transmission scheme and an equivalent wiretap channel based on synchronization errors with insertions and deletions. We then apply Bloch and Laneman [5] to lower bound the secrecy capacity in Section III. In Section IV, we present hidden Markov techniques that allow us to estimate and bound the corresponding information rates. Due to the inherent difficulty of computing such rates in closed form for synchronization error channels, e.g., insertion/deletion channels, in this paper we also reveal a reduced-state technique to lower bound the secrecy capacity. The lower bounding technique is constructive in the sense that it reveals the source distribution that achieves the bound. Finally, in Section V, we plot lower bounds for the secrecy capacity of the erasure/deletion and insertion wiretap channels.

B. Notation

An alphabet of symbols is denoted by a calligraphic letter, say X. Given two elements a, b ∈ X, we denote their concatenation by a‖b ≜ ab. We denote the length-zero word by θ, where a‖θ = θ‖a = a. An alphabet X does not necessarily contain θ. An n-fold Cartesian product of X is denoted by X^n. The alphabet obtained by the n-fold concatenation of symbols from X is denoted by X^(n), and by definition X^(0) = {θ}. Observe that if X = {a,b}, then X^(2) = {aa, ab, ba, bb}. Given a base alphabet X and x_1, x_2, ..., x_n ∈ X, we define the length of y = x_1‖x_2‖...‖x_n = x_1 x_2 ... x_n as

L(y) = min{ k | k ∈ {0,1,...} such that y ∈ X^(k) }.    (1)

Given an alphabet X, we form X̄ as the alphabet that consists of all finite-length concatenations of symbols from X, i.e., X̄ = ∪_{i=0}^∞ X^(i). Note that, by construction, X̄^(n) = X̄ for n ≥ 1.

Random variables are denoted by upper-case letters (X), and their realizations by lower-case letters (x). If k denotes discrete time, then X_k denotes a random variable at time k drawn from the alphabet X. Likewise, X̄_k is assumed to be drawn from X̄. A sequence of random variables X_1, X_2, ..., X_n is denoted by X^n, and we have X^n ∈ X^n. A concatenation of symbols X_1‖X_2‖...‖X_n = X_1 X_2 ... X_n is denoted by X^(n), and we have X^(n) ∈ X^(n). Given two random variables X and Y, the information spectrum (information density) is a random variable, which we denote by i(X;Y); its expected value is the mutual information I(X;Y) = E[i(X;Y)]. Similarly, h(X), a random variable, denotes the entropy spectrum, and H(X) = E[h(X)] denotes its expected value. We denote the binary entropy function by^4 h_b(x) ≜ −x log(x) − (1−x) log(1−x).

^4 Without loss of generality, we assume all logarithms are base 2.

II. TRANSMISSION WIRETAP MODEL

A. Transmission Scheme

Let X_k be the channel input drawn from the finite alphabet X = {0,1} (see Fig. 1). Our synchronization error channel has insertions and deletions with respective probabilities i and d. Let Z̄_k be the transmitter output drawn from the alphabet Z̄, where Z = X. The symbols Z̄_k are drawn using the shared source of randomness, R, as in Algorithm 1.

Algorithm 1: Transmission Scheme
Data: R, X_1, X_2, ...
Result: Transmit Z̄_1, Z̄_2, ...
while X_k at input do
    using R, generate a geometric random variable N_k ∈ {0,1,...} with success probability (1−i), i.e., Pr(N_k = j) = (1−i) i^j;
    generate N_k Bernoulli(1/2) random variables B_1, B_2, ..., B_{N_k};
    using R, generate a Bernoulli(d) random variable D_k;
    if D_k == 1 then
        Z̄_k = B_1 ‖ B_2 ‖ ... ‖ B_{N_k};
    else
        Z̄_k = X_k ‖ B_1 ‖ B_2 ‖ ... ‖ B_{N_k};
    end
    Transmit Z̄_k
end

Since the intended receiver has access to the shared source of randomness, it can generate N_1, N_2, ... and D_1, D_2, .... Therefore it can create a sequence Y_1, Y_2, ... by removing inserted symbols and substituting all deletions by erasures (denoted by ε), i.e., the alphabet for Y_k is Y = {ε} ∪ X.
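For concreteness, the following Python sketch mirrors Algorithm 1 and the receiver-side resynchronization described above. It is our own illustration rather than part of the paper: the function names, and the use of a seeded pseudorandom generator as a stand-in for the shared source of randomness R, are assumptions made only for this example. In the actual system Bob regenerates N_k and D_k from R; here the transmitter returns them so the sketch stays self-contained.

import random

def transmit(x, i, d, R):
    """Algorithm 1 (sketch): pass the input bits x through the synchronization
    error channel, driven by the shared source of randomness R (a seeded PRNG)."""
    z = []                        # flat transmitted sequence (what Eve observes)
    n_list, d_list = [], []       # insertion counts N_k and deletion flags D_k
    for xk in x:
        Nk = 0                    # geometric: Pr(N_k = j) = (1 - i) * i**j
        while R.random() < i:
            Nk += 1
        Dk = 1 if R.random() < d else 0      # Bernoulli(d) deletion flag
        n_list.append(Nk)
        d_list.append(Dk)
        if Dk == 0:
            z.append(xk)          # transmit the source bit ...
        z.extend(R.randint(0, 1) for _ in range(Nk))   # ... then N_k inserted Bernoulli(1/2) bits
    return z, n_list, d_list

def resynchronize(z, n_list, d_list):
    """Bob's side: knowing N_k and D_k (regenerated from R in the real system),
    strip inserted symbols and replace each deleted symbol by an erasure 'e'."""
    y, pos = [], 0
    for Nk, Dk in zip(n_list, d_list):
        if Dk == 1:
            y.append('e')         # deleted symbol becomes an erasure
        else:
            y.append(z[pos])
            pos += 1
        pos += Nk                 # skip the N_k inserted symbols
    return y

R = random.Random(0)              # stand-in for the shared source of randomness
x = [0, 1, 1, 0]
z, n_list, d_list = transmit(x, i=0.3, d=0.2, R=R)
print("z =", z, " y =", resynchronize(z, n_list, d_list))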
Lemma II.1. E[ L( Z^(n) ) ] = n · (1 − (1−i)d) / (1−i).

Example 1 (Insertions and Deletions). Let z^(n) denote the sequence of symbols generated by the transmitter after processing the first n input symbols x_1, x_2, ..., x_n. For example, suppose (x_1, x_2, x_3, x_4) = (0,1,1,0) and the transmitter decides to a) transmit x_1 and insert symbols 1,1, b) transmit x_2 without any insertions, c) delete x_3, and d) transmit x_4 and insert a symbol 0 (see Fig. 2). We have that z^(1) = 011, z^(2) = 0111, z^(3) = 0111, and z^(4) = 011100. Thus Alice transmits 011100. After resynchronization, Bob has (y_1, y_2, y_3, y_4) = (0, 1, ε, 0).

[Fig. 2. Example 1: x^4 = (0,1,1,0), y^4 = (0,1,ε,0), z^(4) = 011100, i.e., z_1, z_2, ..., z_6 = 0,1,1,1,0,0.]
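As a quick numerical sanity check of Lemma II.1 (ours, not from the paper), the following sketch estimates E[L(Z^(n))]/n by simulating only the per-input-symbol length contributions of Algorithm 1, and compares the estimate with the closed-form value (1 − (1−i)d)/(1−i).

import random

def simulated_length_rate(i, d, n=200_000, seed=1):
    """Monte Carlo estimate of E[L(Z^(n))]/n under Algorithm 1."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n):
        Nk = 0                          # geometric number of insertions
        while rng.random() < i:
            Nk += 1
        deleted = rng.random() < d      # Bernoulli(d) deletion
        total += Nk + (0 if deleted else 1)
    return total / n

for i, d in [(0.0, 0.3), (0.3, 0.0), (0.2, 0.2)]:
    closed_form = (1 - (1 - i) * d) / (1 - i)
    print(i, d, round(simulated_length_rate(i, d), 3), round(closed_form, 3))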
B. Equivalent Wiretap Channel Model

An equivalent wiretap channel model, in the terminology of Bloch and Laneman [5], consists of the main channel being an erasure channel, and the wiretap channel being an insertion channel followed by an ε-deletion channel. We define an ε-deletion channel as a channel that deletes the erasure symbols, ε. In particular, note that a deletion channel is equivalent to an erasure channel followed by an ε-deletion channel.

[Fig. 3. Equivalent Degraded Wiretap Insertion/Deletion Model. (X^n passes through an erasure channel to give Y^n; Y^n passes through an insertion channel and an ε-deletion channel to give Z^(n).)]

III. SECRECY CAPACITY

A main strength of Bloch and Laneman's proof [5] of secrecy, using information spectra and channel resolvability, is its generality. In particular, their notation uses Z^n for the channel output, i.e., the output is synchronized with the channel input. Similarly to Han [9, p. 100], we claim that the results of [5] apply to more general channels, e.g., the synchronization error wiretap model with output Z^(n).

Theorem III.1 (Bloch-Laneman Secrecy Capacity [5]). The secrecy capacity, C_s, of the synchronization error wiretap model for secrecy metric S_2 is

C_s = sup_{{V^n,X^n}_{n≥1}} [ p-liminf_{n→∞} (1/n) i(V^n; Y^n) − p-limsup_{n→∞} (1/n) i(V^n; Z^(n)) ],    (2)

where the process {V^n, X^n}_{n≥1} satisfies V^n → X^n → (Z^(n), Y^n) for all n ∈ Z^+.

Proof: The proof in [5] is written for channels in which Z^(n) = Z^n, and uses two corollaries in Pinsker [10]. Note that the two Pinsker [10] corollaries hold for synchronization error channels if we use Z^(n) instead of Z^n. Similarly, Bloch and Laneman's proof that S_3 ⪰ S_4 for secrecy metrics holds. The remainder of their proof relies on results in channel resolvability from Han [9], which hold for channels with synchronization errors.

Corollary III.2 (Lower Bound for the Secrecy Capacity of the Insertion/Deletion Wiretap Channel Model). Let M represent the set of all homogeneous Markov chains defined on the alphabet X. Then

C_s ≥ max_{{X^n}∈M} lim_{n→∞} (1/n) [ I(X^n; Y^n) − I(X^n; Z^(n)) ].    (3)

Proof: In our channel model we have assumed that the wiretap channel is a degraded version of the main channel. Thus the supremum in (2) is attained for V^n = X^n. If X^n is a homogeneous Markov process, the limit I = lim_{n→∞} (1/n) I(X^n; Z^(n)) exists and is finite (see the general proof in [8], but without the sup in equation 2.7 there). Furthermore, as in [8], pp. 17-19, the two sequences X^n and Z^(n) satisfy

lim_{n→∞} Pr( | i(X^n; Z^(n)) / (nI) − 1 | > δ ) = 0,  ∀δ > 0.    (4)

This is obtained in a similar fashion as in Dobrushin [8], but again without the sup in eq. 2.7 of [8]. It follows that

p-limsup_{n→∞} (1/n) i(X^n; Z^(n)) = lim_{n→∞} (1/n) I(X^n; Z^(n)) = I.

Observe that the main channel is a discrete memoryless channel. Furthermore, given a stationary and ergodic input process, the output process is clearly stationary and ergodic. Thus,

p-liminf_{n→∞} (1/n) i(X^n; Y^n) = lim_{n→∞} (1/n) I(X^n; Y^n).    (5)

Finally, the inequality in (3) follows because we restricted the input to homogeneous Markov processes.

Our intent in looking at the combined insertion/deletion channel is to balance the output. We would like the ability to satisfy power requirements of the transmitter or receiver by altering the deletion and insertion probabilities. We shall now look at two contrasting cases, namely i = 0 and d = 0.

1) Insertion Wiretap Channel: The d = 0 scenario is attractive if the receiver has power (and computational) restrictions. We see that once in sync with the transmitter, the receiver is only required to demodulate bits declared as non-insertion bits by the shared random source, i.e., X^n = Y^n. Thus the decoder is easy to implement on a low power device. The secrecy capacity, which we denote by C_si, is bounded as

C_si ≥ lim_{n→∞} (1/n) [ I(X^n; Y^n) − I(X^n; Z^(n)) ]    (6)
     = lim_{n→∞} (1/n) [ H(X^n) − H(Z^(n)) + H(Z^(n)|X^n) ],    (7)

where X^n is an arbitrary Markov process.

2) Deletion/Erasure Wiretap Channel: For the deletion/erasure wiretap model we have i = 0. This is the dual case to the insertion wiretap channel in the sense that the deletion of bits is relatively easy to implement at the transmitter. The main channel is now a discrete memoryless erasure channel for which there are well known codes, yet the receiver still requires some computational power to decode. This is a good choice if the transmitter is power limited (because it transmits fewer bits) but the receiver has ample power/computational resources. The secrecy capacity for this model, denoted by C_sd, is now bounded as

C_sd ≥ lim_{n→∞} (1/n) [ I(X^n; Y^n) − I(X^n; Z^(n)) ]    (8)
     = lim_{n→∞} (1/n) [ H(Y^n) − n·h_b(d) − H(Z^(n)) + H(Z^(n)|X^n) ],

where X^n is an arbitrary Markov process.
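In (7) and (8), the terms not involving Z^(n) are available in closed form; the remaining terms are the subject of Section IV. As a minimal illustration (ours, assuming a first-order Markov input with an example transition matrix P that is not taken from the paper), the sketch below evaluates these closed-form pieces: the Markov entropy rate lim (1/n)H(X^n) [7] and the binary entropy h_b(d).

import numpy as np

def stationary(P):
    """Stationary distribution of a row-stochastic transition matrix P."""
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1))])
    return pi / pi.sum()

def markov_entropy_rate(P):
    """Closed-form entropy rate (bits/symbol) of a stationary Markov chain [7]."""
    pi = stationary(P)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(P > 0, -P * np.log2(P), 0.0)
    return float(pi @ terms.sum(axis=1))

def h_b(x):
    """Binary entropy function."""
    return 0.0 if x in (0.0, 1.0) else -x * np.log2(x) - (1 - x) * np.log2(1 - x)

# Example first-order Markov input: Pr(1|0) = 0.4, Pr(0|1) = 0.3 (arbitrary values)
P = np.array([[0.6, 0.4],
              [0.3, 0.7]])
d = 0.1
print(markov_entropy_rate(P), h_b(d))   # H(X)-rate term in (7); per-symbol h_b(d) term in (8)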
IV. NUMERICALLY BOUNDING C_s

In this section we construct numerical techniques to lower bound C_si and C_sd of the insertion and deletion eavesdropper's channels given in the previous section, respectively. We shall assume the input process X^n is an M-th order binary Markov process, i.e., X = {0,1}, whose 2^M × 2^M transition probability matrix is P. The output of the eavesdropper's channel is Z^(n) ∈ Z̄ = X̄. The main channel (X^n → Y^n) is a memoryless erasure channel, and Y^n ∈ Y^n = {ε,0,1}^n.

A. Computing lim_{n→∞} (1/n) H(X^n)

Being Markov, X^n has a closed-form entropy rate [7].

B. Computing lim_{n→∞} (1/n) H(Y^n)

Since X^n is a Markov process, Y^n is a hidden Markov process obtained by passing X^n through an erasure channel with erasure probability d. The entropy rate of Y^n can be evaluated using trellis-based Monte Carlo techniques [11].

C. Computing lim_{n→∞} (1/n) H(Z^(n))

Using Lemma II.1, we have

lim_{n→∞} (1/n) H(Z^(n)) = ((1 − d + di)/(1 − i)) · lim_{k→∞} (1/k) H(Z^k)    (9)

where Z^k is the sequence of symbols Z_k at the output of the eavesdropper's channel. [Note, Z^k ∈ Z^k = {0,1}^k, whereas Z^(n) ∈ Z̄.] Next, we notice that the sequence Z^k is itself a hidden Markov process with an underlying Markov state sequence S^k whose transition probability matrix is Q (explicitly given further down). That is,

Pr(z^k, s^k | s_0) = ∏_{m=1}^{k} Pr(z_m | s_{m−1}, s_m) Q(s_{m−1}, s_m)    (10)

where Q(s_{m−1}, s_m) are entries in Q. The state s_m is described as a binary string. If X_k is a Markov source of order M, then s_m is a binary string of length M. We denote by l(s_m) the last (i.e., the M-th) binary digit of s_m. With this notation, we can fully describe the hidden Markov process Z^k as follows.

1) Insertion channel: We have

Q = Q_0 = (1−i) P + i I,    (11)

and

Pr(z_m | s_{m−1}, s_m) =
  (i/2) / Q(s_{m−1}, s_m),        if s_{m−1} = s_m and z_m ≠ l(s_m),
  1 − (i/2) / Q(s_{m−1}, s_m),    if s_{m−1} = s_m and z_m = l(s_m),    (12)
  1,                              if z_m = l(s_m), for all other valid pairs (s_{m−1}, s_m).

2) Deletion channel: We have

Q = Q_d = (1−d) [I − dP]^{−1} P,    (13)

and

Pr(z_m | s_{m−1}, s_m) = { 1 if z_m = l(s_m);  0 if z_m ≠ l(s_m) }.    (14)

Since Z^k is hidden Markov, its information rate is computable using trellis-based Monte Carlo techniques [11].
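The following sketch (our own illustration, restricted to a first-order Markov input, M = 1, so that l(s) = s) builds Q_0 and Q_d from (11) and (13), uses the edge emission probabilities of (12) and (14), and estimates lim (1/k)H(Z^k) with the trellis-based Monte Carlo method of [11]: simulate a long realization z^k and evaluate −(1/k) log_2 Pr(z^k) by the normalized forward recursion. Function names and the example source parameters are assumptions made for this sketch.

import numpy as np

rng = np.random.default_rng(0)

def q_insertion(P, i):
    """Eq. (11): state transition matrix Q_0 of Z^k for the insertion channel."""
    return (1 - i) * P + i * np.eye(P.shape[0])

def q_deletion(P, d):
    """Eq. (13): state transition matrix Q_d of Z^k for the deletion channel."""
    return (1 - d) * np.linalg.inv(np.eye(P.shape[0]) - d * P) @ P

def insertion_emission(Q, i):
    """Eq. (12) for M = 1: the state is the last source bit, so l(s) = s."""
    def pr(s_prev, s, z):
        if s_prev == s:
            return i / (2 * Q[s_prev, s]) if z != s else 1 - i / (2 * Q[s_prev, s])
        return 1.0 if z == s else 0.0
    return pr

def deletion_emission():
    """Eq. (14) for M = 1: the output bit equals the last source bit."""
    return lambda s_prev, s, z: 1.0 if z == s else 0.0

def entropy_rate_mc(Q, pr, k=100_000):
    """Trellis-based Monte Carlo estimate of lim (1/k) H(Z^k), in bits/symbol [11]:
    simulate z^k from the hidden Markov model, then accumulate -(1/k) log2 Pr(z^k)
    via the normalized forward recursion."""
    n = Q.shape[0]
    pi = np.linalg.matrix_power(Q, 100)[0]
    pi = pi / pi.sum()                      # (approximately) stationary start distribution
    s = rng.choice(n, p=pi)
    z_seq = []
    for _ in range(k):                      # simulate the state and output sequences
        s_next = rng.choice(n, p=Q[s])
        p = np.array([pr(s, s_next, z) for z in (0, 1)])
        z_seq.append(rng.choice(2, p=p))
        s = s_next
    alpha, logp = pi.copy(), 0.0            # forward recursion over the trellis
    for z in z_seq:
        alpha = np.array([sum(alpha[a] * Q[a, b] * pr(a, b, z) for a in range(n))
                          for b in range(n)])
        c = alpha.sum()
        logp += np.log2(c)
        alpha = alpha / c
    return -logp / k

P = np.array([[0.6, 0.4],                   # example first-order Markov source
              [0.3, 0.7]])
i_prob, d_prob = 0.2, 0.2
Q0, Qd = q_insertion(P, i_prob), q_deletion(P, d_prob)
print("H(Z)-rate, insertion channel:", entropy_rate_mc(Q0, insertion_emission(Q0, i_prob)))
print("H(Z)-rate, deletion channel: ", entropy_rate_mc(Qd, deletion_emission()))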
D. Bounding lim_{n→∞} (1/n) H(Z^(n)|X^n)

1) Insertion Wiretap Channel: For the insertion channel, there are finitely many possible states for each received z_m, i.e., (k+1) possible states at time k (with regard to the receiver). For this paper we did not apply reduced-state techniques to the insertion channel, since exact computation remains feasible for k ≈ 10^6.

2) Deletion Wiretap Channel: For the deletion channel, we do not have a method to compute the conditional entropy rate lim_{n→∞} (1/n) H(Z^(n)|X^n), so we resort to a reduced-state technique to lower bound the conditional entropy rate. For this purpose, we first need to formulate an appropriate trellis (with a possibly countably infinite number of states per trellis section), and then apply an appropriate reduced-state technique to reduce the number of states per trellis section to a finite number and guarantee the lower bound on the conditional entropy rate.

Because of space constraints, we do not fully describe the method for reducing the states, but refer the reader to [12] for general reduced-state techniques for upper bounding entropy rates. The trellises constructed for computing the conditional entropy rate, lim_{n→∞} (1/n) H(Z^(n)|X^n), for the deletion and insertion channels have the form shown in Fig. 4 and Fig. 5, respectively. Note that the trellises are drawn/constructed only after a realization x_1, x_2, ... becomes available.

[Fig. 4. Conditional Entropy Rate Trellis for the Deletion Channel. (States S_0, S_1, S_2, ...; branch labels of the form probability/output, e.g., (1−d)/x_1, (1−d)/x_2.)]

[Fig. 5. Conditional Entropy Rate Trellis for the Insertion Channel. (States S_0, S_1, S_2, ...; insertion self-loops labeled i/2 per inserted bit value 0 or 1.)]

E. Optimization of Markov Sources

Any Markov source will result in a lower bound for C_si and C_sd. The best lower bound is obtained by maximizing the bound over varying Markov orders and Markov chain parameters P. This optimization can be done by a generalized Blahut-Arimoto algorithm [13], [14] adapted to wiretap channels [15].

V. NUMERICAL RESULTS

In this paper, we present numerical results (see Fig. 6) only for Markov processes of order M = 1, and we therefore do not resort to Blahut-Arimoto-type optimizations, because an exhaustive search optimization is feasible for low-order Markov processes. For both the insertion and the deletion channel we averaged 100 simulations that were run on a trellis of length 10^5.

We see that although lim_{i→1} C_si = 1, in our transmission scheme this is of little help since, as i → 1, we are unable to transmit to the intended user. Thus we must also look at the effective transmission rate, R_E, which we define as

R_E = n / E[ L( Z^(n) ) ].    (15)

Observe that for our system we have R_E = (1−i)/(1−(1−i)d), and for i = 1 and d = 0, R_E = 0. Thus even though, with regard to our equivalent wiretap channel model, we have C_si = 1, our implementation has R_E = 0, and we can therefore not communicate with the intended user.

[Fig. 6. Secrecy Capacity (Lower Bound) vs. Deletion/Insertion Probability for first-order Markov (FOM) inputs. (Curves: Deletion Channel (FOM) and Insertion Channel (FOM); y-axis: information rate in bits per symbol X_k; x-axis: deletion/insertion probability.)]
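A small sketch (ours) of the effective-rate computation in (15): R_E = (1−i)/(1−(1−i)d) follows from Lemma II.1, and the loop below tabulates it for a few operating points, confirming that R_E vanishes as i → 1 and that R_E = 1/(1−d) for the pure deletion scheme (i = 0).

def effective_rate(i, d):
    """R_E = n / E[L(Z^(n))] = (1 - i) / (1 - (1 - i) * d), cf. (15) and Lemma II.1."""
    return (1 - i) / (1 - (1 - i) * d)

for i in (0.0, 0.5, 0.9, 0.99):
    print(f"insertion-only (d = 0):  i = {i:.2f}  R_E = {effective_rate(i, 0.0):.3f}")
for d in (0.1, 0.3, 0.5):
    print(f"deletion-only (i = 0):   d = {d:.1f}   R_E = {effective_rate(0.0, d):.3f}")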
VI. CONCLUSION

We have presented a secrecy system based on synchronization errors, and provided techniques to lower bound the secrecy capacity. Furthermore, we have evaluated these bounds (using first-order Markov inputs) for two important instantiations of the model. The proposed system is a complementary example to the one-time pad [17], [18], in the sense that it is a method to use a shared source of randomness and achieve information theoretic security, yet it is not a stream cipher. The method has several advantages over a one-time pad, e.g., the shared source of randomness is not uniquely determined given ciphertext and plaintext.

The transmission scheme also has advantages with regard to system design. We have the ability to choose insertion and deletion probabilities to satisfy receiver and transmitter requirements. In particular, in a system with powerful base stations and lightweight distributed sensors, the base stations can transmit based on d = 0 < i < 1 and the lightweight sensors can transmit based on i = 0 < d < 1. Thus the lightweight sensors transmit less and decrypt more easily, while the base stations can transmit more and decode using computationally intensive error correction decoders.

REFERENCES

[1] A. D. Wyner, "The wire-tap channel," Bell System Technical Journal, vol. 54, no. 8, pp. 1355-1387, Oct. 1975.
[2] M. Hayashi, "General nonasymptotic and asymptotic formulas in channel resolvability and identification capacity and their application to the wiretap channel," IEEE Trans. Inf. Theory, vol. 52, no. 4, pp. 1562-1575, April 2006.
[3] I. Csiszar and J. Korner, "Broadcast channels with confidential messages," IEEE Trans. Inf. Theory, vol. 24, no. 3, pp. 339-348, May 1978.
[4] T. Han and S. Verdu, "Approximation theory of output statistics," IEEE Trans. Inf. Theory, vol. 39, no. 3, pp. 752-772, May 1993.
[5] M. Bloch and J. N. Laneman, "On the secrecy capacity of arbitrary wiretap channels," in Proc. 46th Allerton Conf. Commun., Control, Comput., Monticello, IL, USA, pp. 818-825, Sep. 2008.
[6] M. J. Mihaljevic, "An approach for light-weight encryption employing dedicated coding," in Proc. IEEE GLOBECOM 2012, Anaheim, CA, USA, Dec. 2012, pp. 892-898.
[7] T. Cover and J. Thomas, Elements of Information Theory, 2nd ed., Wiley & Sons, New York, 2006.
[8] R. L. Dobrushin, "Shannon's theorems for channels with synchronization errors," Problems Inform. Transmission, vol. 3, no. 4, pp. 11-26, 1967.
[9] T. S. Han, Information-Spectrum Methods in Information Theory, Springer, 2002.
[10] M. S. Pinsker, Information and Information Stability of Random Variables and Processes, Holden-Day, 1964.
[11] D. Arnold, A. Kavcic, H.-A. Loeliger, P. O. Vontobel, and W. Zeng, "Simulation-based computation of information rates: Upper and lower bounds," in Proc. 2003 IEEE Int. Symp. Information Theory, Yokohama, Japan, p. 231, Jun./Jul. 2003.
[12] A. Kavcic and R. H. Motwani, "Insertion/deletion channels: Reduced-state lower bounds on channel capacities," in Proc. IEEE Int. Symp. Inf. Theory, p. 229, 2004.
[13] A. Kavcic, "On the capacity of Markov sources over noisy channels," in Proc. IEEE GLOBECOM, San Antonio, TX, pp. 2997-3001, 2001.
[14] P. O. Vontobel, A. Kavcic, D. Arnold, and H.-A. Loeliger, "A generalization of the Blahut-Arimoto algorithm to finite-state channels," IEEE Trans. Inf. Theory, vol. 54, no. 5, pp. 1887-1918, 2008.
[15] Y. Sankarasubramaniam, A. Thangaraj, and K. Viswanathan, "Finite-state wiretap channels: Secrecy under memory constraints," in Proc. IEEE Inf. Theory Workshop, pp. 115-119, 2009.
[16] L. R. Bahl, J. Cocke, F. Jelinek, and J. Raviv, "Optimal decoding of linear codes for minimizing symbol error rate," IEEE Trans. Inf. Theory, pp. 284-287, March 1974.
[17] F. Miller, Telegraphic Code to Insure Privacy and Secrecy in the Transmission of Telegrams, C. M. Cornwell, 1882.
[18] C. E. Shannon, "Communication theory of secrecy systems," Bell System Technical Journal, vol. 28, no. 4, pp. 656-715, 1949.