The introduction of the double cut and join (DCJ) operation and the derivation of its associated distance sparked a flurry of research into multichromosomal rearrangements. However, little of this work has incorporated indels (i.e., insertions and deletions) into the calculation of genomic distance functions, a notable exception being Braga et al., who provided a linear-time algorithm [1] for computing the DCJ-indel distance. Although this algorithm runs in linear time, its derivation is lengthy and depends on a large number of cases. In this paper, we provide a simplified indel model that solves the problem of DCJ-indel sorting in linear time directly from the classical breakpoint graph, an approach that allows us to describe the solution space of DCJ-indel sorting, thus resolving an existing open problem.
The double cut and join (DCJ) operation is a fundamental graph operation used to model a variety of genome rearrangements. However, DCJs are only useful when comparing genomes with equal (or nearly equal) gene content. A natural extension of the DCJ framework supplements DCJs with insertions and deletions of chromosomes and chromosomal intervals, yielding a model in which DCJs receive unit cost while insertions and deletions receive a nonnegative cost ω. This paper proposes a unified model for finding a minimum-cost transformation of one genome (with circular chromosomes) into another genome for any value of ω. In the process, it resolves the previously open case ω > 1.
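As a brief sketch of the cost model described above (the symbols $d_\omega$, $T$, $\mathrm{dcj}(T)$, and $\mathrm{id}(T)$ are illustrative notation, not taken from the paper), the quantity being minimized can be written as

% Sketch of the implied cost model: each DCJ costs 1, each indel costs \omega >= 0.
\[
  d_{\omega}(A,B) \;=\; \min_{T:\,A \to B} \bigl( \mathrm{dcj}(T) + \omega \cdot \mathrm{id}(T) \bigr),
\]

where the minimum is taken over all transformations $T$ of genome $A$ into genome $B$, and $\mathrm{dcj}(T)$ and $\mathrm{id}(T)$ count the DCJ operations and the insertions/deletions used by $T$, respectively.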