Article ID: 436548
Journal: Theoretical Computer Science
Published Year: 2013
Pages: 18
File Type: PDF
Abstract

We use an information-theoretic notion, namely the (Shannon) information rate, to generalize common syntactic similarity metrics between strings (such as Hamming distance and longest common subsequence) to similarity metrics between languages. We show that these similarity metrics between two regular languages are computable. We further study the self-similarity of a regular language under various similarity metrics. As far as semantic similarity is concerned, we study the amplitude of an automaton, which intuitively characterizes how much a typical execution of the automaton fluctuates. Finally, we investigate, through experiments, how to measure similarity between two real-world programs by applying Lempel–Ziv compression to their runs at the assembly level.
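The compression-based experiment in the last sentence can be illustrated with a minimal sketch. The snippet below is not the authors' implementation; it is a standard normalized compression distance (NCD) computed with Python's `zlib` (a Lempel–Ziv-based compressor), applied to two hypothetical assembly-level run traces. A small NCD indicates the two runs share much compressible structure, i.e. the programs behave similarly.

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance using zlib (an LZ77-family
    compressor): (C(xy) - min(C(x), C(y))) / max(C(x), C(y)).
    Values near 0 mean highly similar; near 1 means dissimilar."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Hypothetical assembly-level run traces: two near-identical runs
# versus an unrelated byte stream.
run_a = b"push eax; mov ebx, 4; call f; ret; " * 40
run_b = b"push eax; mov ebx, 4; call g; ret; " * 40
unrelated = bytes(range(256)) * 8
```

Under this sketch, `ncd(run_a, run_b)` comes out much smaller than `ncd(run_a, unrelated)`, matching the intuition that similar programs produce runs that compress well together.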

Related Topics
Physical Sciences and Engineering › Computer Science › Computational Theory and Mathematics