|Computer Science/Discrete Mathematics Seminar I|
|Topic:||Locally Repairable Codes, Storage Capacity and Index Coding|
|Affiliation:||University of Massachusetts, Amherst|
|Date:||Monday, February 5|
|Time/Room:||11:00am - 12:15pm/Simonyi Hall 101|
An error-correcting code is called locally repairable if any coordinate of a codeword can be recovered by accessing only a few other coordinates. For locally repairable codes over small alphabets (such as binary), the optimal trade-off between the redundancy and the minimum distance of a code is unknown. In this talk we will present the tightest possible bounds on the redundancy of a locally repairable code with a given minimum distance. For general codes, the lower bound (converse) follows from a simple shortening argument. Recently, only mild improvements on this shortening bound have been obtained for linear codes, using the theory of coset leader graphs. For upper bounds (achievability), a Gilbert-Varshamov-type bound holds.
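To make the locality notion concrete, here is a minimal sketch (an illustrative toy construction, not one from the talk): a binary code whose message bits are grouped in pairs, with an XOR parity per pair, so every coordinate has locality 2.

```python
# Toy binary locally repairable code with locality r = 2 (illustrative only).
# Message (a, b, c, d) is encoded as two local groups (a, b, a^b) and (c, d, c^d);
# any single erased coordinate is repaired from the other two in its group.

def encode(msg):
    """Encode 4 message bits into a length-6 codeword with two local parity groups."""
    a, b, c, d = msg
    return [a, b, a ^ b, c, d, c ^ d]

GROUPS = [(0, 1, 2), (3, 4, 5)]  # each tuple is one local repair set

def repair(codeword, erased):
    """Recover the erased coordinate by XOR-ing the other two coordinates of its group."""
    for group in GROUPS:
        if erased in group:
            u, v = [i for i in group if i != erased]
            return codeword[u] ^ codeword[v]
    raise ValueError("coordinate not in any repair group")

cw = encode([1, 0, 1, 1])
for i in range(6):
    assert repair(cw, i) == cw[i]  # every coordinate is repairable from 2 others
```

The trade-off the talk addresses is visible even here: the two parity bits buy locality 2, but the code's minimum distance is only 2, so it cannot correct a single worst-case error.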
The local repair problem generalizes naturally to a graph: suppose each vertex of a graph stores a symbol from some alphabet, and the content of any vertex can be recovered by accessing the stored contents of its neighbors. The storage capacity of the graph is defined as the maximum amount of information that can be stored in the graph under this constraint. This problem is closely related to a canonical network coding problem called index coding. In index coding, we are given a graph in which each vertex represents a user. Each user wants to learn the value of a variable and knows the values of the variables its neighbors want. How many bits of information must a broadcaster send so that every user learns the value it wants? This quantity is called the index coding rate. We will show some results concerning the computation of the storage capacity and the index coding rate.
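The storage-capacity constraint can be illustrated on the smallest interesting graph (a toy example chosen for this write-up, not a result from the talk): on the triangle K3, placing bits x, y, and x^y on the three vertices satisfies the neighbor-recovery constraint while carrying 2 bits of information, so the storage capacity of K3 is at least 2.

```python
from itertools import product

# Toy example: storage on the triangle graph K3 (illustrative only).
# Each vertex stores one bit, and each stored bit must be recoverable from
# the bits of its neighbors. Storing (x, y, x^y) works because each of the
# three bits is the XOR of the other two.

def store(x, y):
    """Place the 2-bit message (x, y) on the vertices of K3."""
    return (x, y, x ^ y)

def recover(vertex, contents):
    """In K3 every vertex neighbors the other two; XOR their bits."""
    a, b = [contents[u] for u in range(3) if u != vertex]
    return a ^ b

# All 4 messages yield distinct configurations, and every vertex is repairable.
configs = set()
for x, y in product([0, 1], repeat=2):
    c = store(x, y)
    configs.add(c)
    assert all(recover(v, c) == c[v] for v in range(3))
assert len(configs) == 4  # 4 distinct stored states -> 2 bits of information
```

The same triangle read through the index coding lens: a broadcaster who sends the single bit x^y lets each of three users, armed with its neighbors' variables as side information, decode its own variable, which is the flavor of connection between the two quantities that the talk explores.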