Network Coding for Error Correction and Security


Description
Network Coding for Error Correction and Security. Raymond W. Yeung, The Chinese University of Hong Kong. Outline: Introduction, Network Coding vs Algebraic Coding, Network Error Correction, Secure Network Coding, Applications of Random Network Coding in P2P, Concluding Remarks.
Transcripts
Slide 1

Network Coding for Error Correction and Security. Raymond W. Yeung, The Chinese University of Hong Kong

Slide 2

Outline: Introduction, Network Coding versus Algebraic Coding, Network Error Correction, Secure Network Coding, Applications of Random Network Coding in P2P, Concluding Remarks

Slide 3

Introduction

Slide 4

A Network Coding Example: The Butterfly Network

Slide 5

[Figure: the butterfly network. The source bits b1 and b2 travel along the two side paths, the single bottleneck edge in the middle carries b1 + b2, and each sink receives one bit directly plus b1 + b2.]
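A minimal runnable sketch of this example (added to the transcript; the node labels and the helper function are illustrative assumptions) shows how coding on the bottleneck edge lets both sinks recover both bits in a single use of the network:

# Butterfly network: single source s, two sinks t1 and t2 (labels are illustrative).
def butterfly_multicast(b1: int, b2: int):
    """Source s multicasts bits b1 and b2 to both sinks in one use of the network."""
    to_t1_direct = b1          # side path s -> a -> t1 carries b1
    to_t2_direct = b2          # side path s -> b -> t2 carries b2
    coded = b1 ^ b2            # bottleneck edge w -> x carries b1 + b2 over GF(2)
    # Each sink XORs the coded symbol with its direct bit to recover the missing bit.
    t1 = (to_t1_direct, coded ^ to_t1_direct)
    t2 = (coded ^ to_t2_direct, to_t2_direct)
    return t1, t2

for b1 in (0, 1):
    for b2 in (0, 1):
        assert butterfly_multicast(b1, b2) == ((b1, b2), (b1, b2))

Without the coded symbol on the bottleneck edge, a second use of that edge would be needed, which is exactly the bandwidth penalty of pure store-and-forward.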

Slide 6

A Network Coding Example with Two Sources

Slide 7

[Figure: a network with two independent sources generating b1 and b2 for two sessions; the shared bottleneck edge carries b1 + b2.]

Slide 8

[Figure: wireless/satellite relay. At t = 1 one terminal sends b1 to the satellite, at t = 2 the other terminal sends b2, and at t = 3 the satellite broadcasts b1 + b2 to both.] Wireless/Satellite Application: a 50% saving in downlink bandwidth!

Slide 9

Two Themes of Network Coding: When there is one source to be multicast in a network, store-and-forward may fail to optimize bandwidth. When there are two or more independent sources to be transmitted in a network (even for unicast), store-and-forward may fail to optimize bandwidth. In short, information is NOT a commodity!

Slide 10

Model of a Point-to-Point Network: A network is represented by a graph G = (V, E) with node set V and edge (channel) set E. A symbol from an alphabet F can be transmitted on each channel. There can be multiple edges between a pair of nodes.

Slide 11

Single-Source Network Coding: The source node s generates an information vector x = (x1, x2, ..., xk) ∈ F^k. What is the condition for a node t to be able to receive the information vector x? Max-Flow Bound: if maxflow(t) < k, then node t cannot possibly receive x.

Slide 12

The Basic Results: If network coding is allowed, a node t can receive the information vector x iff maxflow(t) ≥ k, i.e., the max-flow bound can be achieved simultaneously by all such nodes t (Ahlswede et al. 00). Moreover, this can be achieved by linear network coding for a sufficiently large base field (Li, Yeung, and Cai; Koetter and Médard, 03).
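As a quick check of the max-flow condition, here is a small self-contained sketch (added to the transcript; the Edmonds-Karp routine and node labels are illustrative assumptions). It computes maxflow(t) for both sinks of the butterfly network with unit-capacity edges; both equal k = 2, so the multicast above is achievable:

from collections import defaultdict, deque

def max_flow(edges, s, t):
    """Edmonds-Karp max flow for a directed multigraph given as a list of (u, v) unit-capacity edges."""
    cap = defaultdict(int)
    adj = defaultdict(set)
    for u, v in edges:
        cap[(u, v)] += 1
        adj[u].add(v)
        adj[v].add(u)                       # residual direction
    flow = 0
    while True:
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:    # BFS for a shortest augmenting path
            u = queue.popleft()
            for v in adj[u]:
                if v not in parent and cap[(u, v)] > 0:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(cap[e] for e in path)
        for u, v in path:                   # augment along the path
            cap[(u, v)] -= bottleneck
            cap[(v, u)] += bottleneck
        flow += bottleneck

# The butterfly network with unit-capacity edges (node labels are illustrative).
butterfly = [('s', 'a'), ('s', 'b'), ('a', 't1'), ('b', 't2'),
             ('a', 'w'), ('b', 'w'), ('w', 'x'), ('x', 't1'), ('x', 't2')]
print(max_flow(butterfly, 's', 't1'), max_flow(butterfly, 's', 't2'))   # 2 2, so k = 2 is achievable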

Slide 13

Global Encoding Kernels of a Linear Network Code: Recall that x = (x1, x2, ..., xk) is the multicast message. For each channel e, assign a column vector f_e such that the symbol sent on channel e is x · f_e. The vector f_e is called the global encoding kernel of channel e. The global encoding kernel of a channel is analogous to a column in the generator matrix of a classical block code. The global encoding kernel of an output channel at a node must be a linear combination of the global encoding kernels of the input channels.

Slide 14

An Example: k = 2, let x = (b1, b2)

Slide 15

[Figure: the butterfly network labelled with the symbols b1, b2, and b1 + b2 carried on its channels, corresponding to the global encoding kernels (1, 0), (0, 1), and (1, 1).]
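A small sketch of this example (added for concreteness; the channel labels and the kernel assignment are assumptions consistent with the figure) verifies that the symbol on each channel is x · f_e, that the kernel of the coded output channel is a linear combination of its inputs' kernels, and that a sink with two linearly independent kernels can decode:

import itertools

# Global encoding kernels for the butterfly network over GF(2), following x = (b1, b2).
kernels = {
    's->a': (1, 0), 'a->t1': (1, 0),                    # these channels carry b1
    's->b': (0, 1), 'b->t2': (0, 1),                    # these channels carry b2
    'a->w': (1, 0), 'b->w': (0, 1),
    'w->x': (1, 1), 'x->t1': (1, 1), 'x->t2': (1, 1),   # these channels carry b1 + b2
}

def symbol(x, f):
    """Symbol sent on a channel with global encoding kernel f: the dot product x . f over GF(2)."""
    return sum(xi * fi for xi, fi in zip(x, f)) % 2

# Local linearity: the kernel of an output channel is a linear combination of its inputs' kernels.
assert kernels['w->x'] == tuple((a + b) % 2 for a, b in zip(kernels['a->w'], kernels['b->w']))

# Sink t1 receives the symbols on channels a->t1 and x->t1; since their kernels (1,0) and (1,1)
# are linearly independent, t1 can solve for x = (b1, b2).
for x in itertools.product((0, 1), repeat=2):
    received = (symbol(x, kernels['a->t1']), symbol(x, kernels['x->t1']))   # (b1, b1 + b2)
    decoded = (received[0], received[0] ^ received[1])
    assert decoded == x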

Slide 16

Network Coding versus Algebraic Coding

Slide 17

A Linear Multicast: A message of k symbols from a base field F is generated at the source node s. A k-dimensional linear multicast has the following property: a non-source node t can decode the message correctly if and only if maxflow(t) ≥ k. By the max-flow bound, this is also a necessary condition for a node t to decode (for any given network code). Hence the tightness of the max-flow bound is achieved by a linear multicast, which always exists for sufficiently large base fields.

Slide 18

An (n, k) Code with d_min = d: Consider an (n, k) classical block code with minimum distance d. View it as a network code on a combination network. Since the (n, k) code can correct d - 1 erasures, all of the nodes at the bottom can decode.

Slide 19

The Combination Network. [Figure: source s at the top, a layer of n intermediate nodes, and bottom nodes each connected to n - d + 1 of the intermediate nodes.]

Slide 20

For the nodes at the bottom, maxflow(t) = n - d + 1. By the max-flow bound, k ≤ maxflow(t) = n - d + 1, or d ≤ n - k + 1, the Singleton bound. Thus the Singleton bound is a special case of the max-flow bound for network coding. An MDS code is a classical block code that achieves tightness of the Singleton bound. Since a linear multicast achieves tightness of the max-flow bound, it is formally a network generalization of an MDS code.
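To make this concrete, here is a small sketch (an illustration added to the transcript, not the authors' construction; the choice of a [4, 2] Reed-Solomon-style code over GF(5) is an assumption). When d = n - k + 1, every bottom node of the combination network, receiving only n - d + 1 = k coded symbols, can still decode, because every k columns of the generator matrix are linearly independent:

import itertools

q, n, k = 5, 4, 2                      # a [4, 2] MDS-style code over GF(5); d = n - k + 1 = 3
points = [0, 1, 2, 3]                  # distinct evaluation points
G = [[pow(a, i, q) for a in points] for i in range(k)]   # 2 x 4 generator matrix (Vandermonde)

def det2(M):
    """Determinant of a 2 x 2 matrix over GF(q)."""
    return (M[0][0] * M[1][1] - M[0][1] * M[1][0]) % q

# Each bottom node of the combination network sees the coded symbols of n - d + 1 = k middle nodes.
# It can decode iff the corresponding k columns of G are linearly independent, i.e. the k x k
# submatrix is invertible, which holds for every choice of columns since the code is MDS.
for cols in itertools.combinations(range(n), k):
    sub = [[G[i][j] for j in cols] for i in range(k)]
    assert det2(sub) != 0
print("every bottom node can decode")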

Slide 21

Two Ramifications of Single-Source Network Coding: The starting point of classical coding theory and information-theoretic cryptography is the existence of a channel through which we can transmit information from Point A to Point B without error. Single-source network coding provides another such conduit. Therefore, we expect that both classical coding theory and information-theoretic cryptography can be extended to networks.

Slide 22

Network Error Correction

Slide 23

Point-to-Point Error Correction in a Network: Classical error-correcting codes are devised for point-to-point communications. Such codes are applied to networks on a link-by-link basis.

Slide 24

[Figure: link-by-link error correction at an intermediate node, with a channel decoder on each incoming link, followed by the network encoder, followed by a channel encoder on the outgoing link.]

Slide 25

A Motivation for Network Error Correction: Observation: only the receiving nodes need to know the message transmitted; the intermediate nodes do not. In general, channel coding and network coding do not need to be separated ⇒ network error correction. Network error correction generalizes classical point-to-point error correction.

Slide 26

Network Codec

Slide 27

What Does Network Error Correction Do? A distributed error-correcting scheme over the network. It does not explicitly decode at intermediate nodes as in point-to-point error correction. At a sink node t, if c errors can be corrected, it means that the transmitted message can be decoded correctly as long as the total number of errors, which can occur anywhere in the network, is at most c.

Slide 28

Classical Algebraic Coding: y = x + z, where x is the codeword, z is the error vector, and y is the received vector; y, x, and z are all in the same space.

Slide 29

Minimum Distance: Classical Case. Hamming distance is the most natural distance measure. For a code C, d_min = min d(v1, v2), where v1, v2 ∈ C and v1 ≠ v2. If d_min = 2c + 1, then C can correct c errors, detect 2c errors, and correct 2c erasures.
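As a concrete illustration (added to the transcript; the binary [7, 4] Hamming code used here is an assumption, not an example from the slides), the sketch below computes d_min of a small code by brute force and checks that minimum-distance decoding corrects c = (d_min - 1)/2 errors:

import itertools

def hamming_distance(u, v):
    return sum(a != b for a, b in zip(u, v))

# A small binary [7, 4] code (the Hamming code in systematic form), used only as an illustration.
P = [(1, 1, 0), (1, 0, 1), (0, 1, 1), (1, 1, 1)]
def encode(msg):
    parity = tuple(sum(m * P[i][j] for i, m in enumerate(msg)) % 2 for j in range(3))
    return tuple(msg) + parity

codebook = [encode(m) for m in itertools.product((0, 1), repeat=4)]
d_min = min(hamming_distance(u, v) for u, v in itertools.combinations(codebook, 2))
c = (d_min - 1) // 2
print(d_min, c)            # 3 1: the code corrects any single error

def nearest_codeword(y):
    """Minimum-distance (nearest-neighbour) decoding."""
    return min(codebook, key=lambda cw: hamming_distance(cw, y))

# Flip any single bit of any codeword: minimum-distance decoding recovers the original codeword.
for cw in codebook:
    for i in range(7):
        y = list(cw)
        y[i] ^= 1
        assert nearest_codeword(tuple(y)) == cw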

Slide 30

[Figure: sphere packing. Codewords are at pairwise Hamming distance at least d_min, and the decoding spheres around them do not overlap.]

Slide 31

Coding Bounds: Classical Case. Upper bounds: the Hamming bound and the Singleton bound. Lower bound: the Gilbert-Varshamov bound.
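For reference, a short sketch (added here; the parameter choices are illustrative) evaluating these classical bounds on codebook size for a q-ary code of length n and minimum distance d:

from math import comb

def sphere_volume(n, r, q):
    """Number of vectors within Hamming distance r of a fixed vector in GF(q)^n."""
    return sum(comb(n, i) * (q - 1) ** i for i in range(r + 1))

def hamming_bound(n, d, q):
    """Upper bound on codebook size (sphere-packing bound)."""
    return q ** n // sphere_volume(n, (d - 1) // 2, q)

def singleton_bound(n, d, q):
    """Upper bound on codebook size: q^(n - d + 1)."""
    return q ** (n - d + 1)

def gilbert_varshamov_bound(n, d, q):
    """Lower bound: a code with at least this many codewords exists."""
    return -(-(q ** n) // sphere_volume(n, d - 1, q))    # ceiling division

n, d, q = 7, 3, 2
print(hamming_bound(n, d, q), singleton_bound(n, d, q), gilbert_varshamov_bound(n, d, q))
# 16 32 5: the [7, 4] Hamming code, with 16 codewords, meets the Hamming bound with equality.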

Slide 32

[Figure: network error correction. The source s transmits the codeword x, an error vector z is injected on the channels, and the sink nodes t, u, v receive the vectors y_t, y_u, y_v.]

Slide 33

Input/Output Relation: The network code is specified by the local encoding kernels at the non-source nodes. Fix a sink node t. The codeword x, the error vector z, and the received vector y_t are all in different spaces. In this tutorial, we consider only linear network codes. Then y_t = x F_{s,t} + z F_t, where F_{s,t} and F_t depend on t. In the classical case, F_{s,t} = F_t is the identity matrix.
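The sketch below (an illustration added to the transcript; the channel ordering and node labels are assumptions) builds F_{s,t} and F_t for sink t1 of the butterfly network over GF(2) by probing a channel-by-channel simulation, and then checks the relation y_t = x F_{s,t} + z F_t for every message and every error pattern:

import itertools

# Channels of the butterfly network in topological order (labels and ordering are illustrative).
# z has one additive GF(2) error per channel, in this order:
# 0: s->a, 1: s->b, 2: a->t1, 3: a->w, 4: b->w, 5: b->t2, 6: w->x, 7: x->t1, 8: x->t2

def transmit(x, z):
    """One use of the butterfly network over GF(2): returns the vector y_t1 received at sink t1."""
    sa = (x[0] + z[0]) % 2
    sb = (x[1] + z[1]) % 2
    at1 = (sa + z[2]) % 2
    aw = (sa + z[3]) % 2
    bw = (sb + z[4]) % 2
    wx = (aw + bw + z[6]) % 2
    xt1 = (wx + z[7]) % 2
    return (at1, xt1)        # channels b->t2 (z[5]) and x->t2 (z[8]) do not reach t1

zero_z = (0,) * 9
unit = lambda j: tuple(int(i == j) for i in range(9))

# Rows of F_{s,t1}: responses to the unit messages with no error (a 2 x 2 matrix over GF(2)).
F_st1 = [transmit((1, 0), zero_z), transmit((0, 1), zero_z)]
# Rows of F_t1: responses to a single unit error on each channel, with the zero message (9 x 2).
F_t1 = [transmit((0, 0), unit(j)) for j in range(9)]

def add(u, v):
    return tuple((a + b) % 2 for a, b in zip(u, v))

def times(coeffs, rows):
    """Row vector times matrix over GF(2), with the matrix given as a list of rows."""
    out = (0, 0)
    for c, row in zip(coeffs, rows):
        if c:
            out = add(out, row)
    return out

# Verify the input/output relation y_t1(x, z) = x F_{s,t1} + z F_t1 for every message and error.
for x in itertools.product((0, 1), repeat=2):
    for z in itertools.product((0, 1), repeat=9):
        assert transmit(x, z) == add(times(x, F_st1), times(z, F_t1))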

Slide 34

Distance Properties of Linear Network Codes (Yang, Yeung, Zhang 07): The network Hamming distance can be defined for linear network codes. Many concepts in algebraic coding based on the Hamming distance can be extended to network coding.

Slide 35

How to Measure the Distance between Two Codewords? Fix both the network code and the codebook C, i.e., the set of all possible codewords transmitted into the network. For a sink node t, y_t(x, z) = x F_{s,t} + z F_t. For two codewords x1, x2 ∈ C, define their distance by D_t^msg(x1, x2) = min_z w_H(z), where the minimum is taken over all error vectors z such that y_t(x1, 0) = y_t(x2, z) or y_t(x1, z) = y_t(x2, 0). Idea: D_t^msg(x1, x2) is the minimum Hamming weight of an error vector z that makes x1 and x2 indistinguishable at node t. D_t^msg defines a metric on the input space of the linear network code.

Slide 36

Minimum Distance for a Sink Node: For a sink node t, d_min,t = min over x1 ≠ x2 of D_t^msg(x1, x2). Each sink node has a different view of the codebook, as each is associated with a different distance measure. d_min,t is the minimum distance as seen by sink node t. If the codebook C is linear, d_min,t has the following equivalent definition: d_min,t = min { w_H(z) : z ∈ A_t }, where A_t = { z : y_t(x, z) = 0 for some nonzero x ∈ C }.
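A brute-force sketch of this definition (added for illustration; the simulation, channel ordering, and choice of codebook are assumptions): for sink t1 of the butterfly network with the full codebook GF(2)^2, it computes D_t^msg for every pair of codewords and hence d_min,t1. With k equal to maxflow(t1) there is no redundancy, so d_min,t1 comes out as 1:

import itertools

# Brute-force computation of d_min,t1 for sink t1 of the butterfly network over GF(2).
# The codebook here is all of GF(2)^2 (k = 2 = maxflow(t1)); channel ordering is illustrative:
# 0: s->a, 1: s->b, 2: a->t1, 3: a->w, 4: b->w, 5: b->t2, 6: w->x, 7: x->t1, 8: x->t2

def y_t1(x, z):
    """Vector received at sink t1 when message x is sent and error z (one entry per channel) occurs."""
    sa = (x[0] + z[0]) % 2
    sb = (x[1] + z[1]) % 2
    at1 = (sa + z[2]) % 2
    aw = (sa + z[3]) % 2
    bw = (sb + z[4]) % 2
    wx = (aw + bw + z[6]) % 2
    xt1 = (wx + z[7]) % 2
    return (at1, xt1)

zero = (0,) * 9
errors = list(itertools.product((0, 1), repeat=9))
codebook = list(itertools.product((0, 1), repeat=2))

def D_msg(x1, x2):
    """D_t^msg(x1, x2): minimum Hamming weight of an error z making x1 and x2 indistinguishable at t1."""
    return min(sum(z) for z in errors
               if y_t1(x1, zero) == y_t1(x2, z) or y_t1(x1, z) == y_t1(x2, zero))

d_min_t1 = min(D_msg(x1, x2) for x1, x2 in itertools.combinations(codebook, 2))
print(d_min_t1)   # 1: with k equal to maxflow(t1), a single channel error can already change the decoded message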

Slide 37

Error Correction/Detection and Erasure Correction for a Linear Network Code: If d_min,t = 2c + 1, then sink node t can correct c errors, detect 2c errors, and correct 2c erasures. Some form of "sphere packing" is at work. The situation is much more complicated when the network code is nonlinear.

Slide 38

[Figure: sphere packing with respect to the network minimum distance d_min,t.]

Slide 39

Remark on Error Detection: In network coding, some error patterns have no effect on the sink nodes. These are "invisible" error patterns that cannot be (or do not need to be) detected. Also called "Byzantine modification detection" (Ho et al., ISIT 04).

Slide 40

Remark on Erasure Correction: In classical algebraic coding, erasure correction has three equivalent interpretations: a symbol is erased means that it is not available; a symbol is erased means that the erasure symbol is received; the error locations are known. In our context, erasure correction means that the locations of the errors are known by the sink nodes but not by the intermediate nodes.

Slide 41

Coding Bounds for Network Codes: Cai & Yeung (02, 06) obtained the Hamming bound, the Singleton bound, and the Gilbert-Varshamov bound for network codes. These bounds are natural extensions of the bounds for algebraic codes. Let the base field be GF(q), n = min_t maxflow(t), and d_min = min_t d_min,t.

Slide 42

Upper Bounds. Hamming bound: |C| ≤ q^n / Σ_{i=0}^{c} (n choose i)(q - 1)^i, where c = ⌊(d_min - 1)/2⌋. Singleton bound: |C| ≤ q^{n - d_min + 1}. The Singleton bound is asymptotically tight, i.e., when q is sufficiently large.

Slide 43

Refined Coding Bounds. Observation: sink nodes with larger maximum flow can have better error-correction capability. For a given linear network code, refined Hamming bounds and Singleton bounds specific to the individual sink nodes can be obtained.
