### Abstract

We consider the question of how much information can be stored by labeling the vertices of a connected undirected graph G using a constant-size set of labels, when isomorphic labelings are not distinguishable. An exact information-theoretic bound is easily obtained by counting the number of isomorphism classes of labelings of G, which we call the information-theoretic capacity of the graph. More interesting is the effective capacity of members of some class of graphs, the number of states distinguishable by a Turing machine that uses the labeled graph itself in place of the usual linear tape. We show that the effective capacity equals the information-theoretic capacity up to constant factors for trees, random graphs with polynomial edge probabilities, and bounded-degree graphs.
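The information-theoretic capacity defined above can be made concrete on tiny instances: enumerate all labelings of the vertices and collapse those related by an automorphism of G into one class. The sketch below (a brute-force illustration, not the paper's construction) counts isomorphism classes of binary labelings for a 3-vertex path, whose automorphism group is {identity, reversal}.

```python
from itertools import permutations, product

def automorphisms(n, edges):
    """All vertex permutations of range(n) that preserve the edge set."""
    eset = {frozenset(e) for e in edges}
    return [p for p in permutations(range(n))
            if {frozenset((p[u], p[v])) for u, v in edges} == eset]

def isomorphism_classes(n, edges, labels):
    """Count labelings of the vertices up to automorphism of the graph."""
    autos = automorphisms(n, edges)
    classes = set()
    for lab in product(labels, repeat=n):
        # Canonical representative: lexicographically least image of the
        # labeling under the automorphism group.
        classes.add(min(tuple(lab[p[i]] for i in range(n)) for p in autos))
    return len(classes)

# Path on 3 vertices with binary labels: the 8 labelings fall into 6
# isomorphism classes, so the information-theoretic capacity is
# log2(6) ≈ 2.58 bits.
print(isomorphism_classes(3, [(0, 1), (1, 2)], (0, 1)))  # → 6
```

Brute force is exponential in the number of vertices, of course; the point of the paper is how much of this capacity a Turing machine can actually exploit when the labeled graph itself serves as the tape.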

| Original language | English (US) |
| --- | --- |
| Title of host publication | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
| Pages | 573-587 |
| Number of pages | 15 |
| Volume | 6366 LNCS |
| DOIs | https://doi.org/10.1007/978-3-642-16023-3_44 |
| State | Published - 2010 |
| Event | 12th International Symposium on Stabilization, Safety, and Security of Distributed Systems, SSS 2010 - New York, NY, United States |
| Duration | Sep 20 2010 → Sep 22 2010 |

### Publication series

| Name | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
| --- | --- |
| Volume | 6366 LNCS |
| ISSN (Print) | 0302-9743 |
| ISSN (Electronic) | 1611-3349 |

### Other

| Other | 12th International Symposium on Stabilization, Safety, and Security of Distributed Systems, SSS 2010 |
| --- | --- |
| Country | United States |
| City | New York, NY |
| Period | 9/20/10 → 9/22/10 |

### ASJC Scopus subject areas

- Computer Science(all)
- Theoretical Computer Science

### Cite this

Angluin, D., Aspnes, J., Bazzi, R., Chen, J., Eisenstat, D., & Konjevod, G. (2010). Storage capacity of labeled graphs. In *Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)* (Vol. 6366 LNCS, pp. 573-587). https://doi.org/10.1007/978-3-642-16023-3_44

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - Storage capacity of labeled graphs

AU - Angluin, Dana

AU - Aspnes, James

AU - Bazzi, Rida

AU - Chen, Jiang

AU - Eisenstat, David

AU - Konjevod, Goran

PY - 2010

Y1 - 2010


UR - http://www.scopus.com/inward/record.url?scp=78249259689&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=78249259689&partnerID=8YFLogxK

U2 - 10.1007/978-3-642-16023-3_44

DO - 10.1007/978-3-642-16023-3_44

M3 - Conference contribution

AN - SCOPUS:78249259689

SN - 3642160220

SN - 9783642160226

VL - 6366 LNCS

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 573

EP - 587

BT - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

ER -